This book on complexity science comprises a collection of chapters on methods and principles from a wide variety of disciplinary fields - from physics and chemistry to biology and the social sciences. In this two-part volume, the first part collects chapters that introduce the different aspects in a coherent fashion and provide a common basis and the founding principles of the different complexity science approaches; the second provides deeper discussions of the different methods used in complexity science, with interesting illustrative applications. The fundamental topics deal with self-organization, pattern formation, forecasting uncertainties, synchronization and revolutionary change, self-adapting and self-correcting systems, and complex networks. Examples are taken from biology, chemistry, engineering, epidemiology, robotics, economics, sociology, and neurology.
This book deals with the autoregressive method for the digital processing of random oscillations. The method is based on a one-to-one transformation of the numeric factors of the Yule series model to the characteristics of a linear elastic system. This parametric approach made it possible to develop a formal procedure for processing experimental data to obtain estimates of the logarithmic decrement and natural frequency of random oscillations. A straightforward mathematical description of the procedure makes it possible to optimize the discretization of oscillation realizations, providing efficient estimates. The derived analytical expressions for the confidence intervals of the estimates enable a priori evaluation of their accuracy. Experimental validation of the method is also provided. Statistical applications for the analysis of mechanical systems arise from the fact that the loads experienced by machinery and various structures often cannot be described by deterministic vibration theory. Therefore, a sufficient description of real oscillatory processes (vibrations) calls for the use of random functions. In engineering practice, linear vibration theory (modeling phenomena by ordinary linear differential equations) is generally used. This theory's fundamental concepts, such as natural frequency, oscillation decrement, and resonance, are credited for its wide use in different technical tasks. In technical applications, two types of research task exist: direct and inverse. The former determines the stochastic characteristics of the system output X(t) resulting from a random input process E(t) when the object model is considered known. The direct task makes it possible to evaluate the effect of an operational environment on the designed object and to predict its operation under various loads. The inverse task is aimed at evaluating the object model from the known processes E(t) and X(t), i.e., finding the model (equation) factors. This task is usually encountered when testing prototypes in order to identify (or verify) their models experimentally. To characterize random processes, the notion of a "shaping dynamic system" is commonly used. This concept allows the observed process to be considered as the output of a hypothetical system whose input is stationary Gauss-distributed ("white") noise. Therefore, the process may be exhaustively described in terms of the parameters of that system. In the case of random oscillations, the "shaping system" is an elastic system described by an ordinary second-order differential equation: Ẍ(t) + 2hẊ(t) + ω₀²X(t) = E(t), where ω₀ = 2π/T₀ is the natural frequency, T₀ is the oscillation period, and h is the damping factor. As a result, the process X(t) can be characterized in terms of the system parameters (the natural frequency and the logarithmic oscillation decrement δ = hT₀) as well as the process variance. These parameters are evaluated by processing experimental data based on frequency- or time-domain representations of the oscillations. It must be noted that the approach to evaluating these parameters has not changed much over the last century. For instance, when the spectral density is used, evaluation of the decrement is linked to bandwidth measurements at the half-power points of the observed oscillations. For a time-domain representation, evaluation of the decrement requires measuring covariance values delayed by time intervals that are multiples of T₀.
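For readability, the shaping-system equation and its parameters can be set out explicitly. The second display shows one standard way the poles of the sampled damped oscillator map onto the coefficients of a discrete Yule (AR(2)) model; it is given here only as a sketch of the kind of correspondence the book formalizes, with Δt the sampling interval and ω₁ the damped frequency (the exact form used in the book may differ).

```latex
\ddot{X}(t) + 2h\,\dot{X}(t) + \omega_0^2\,X(t) = E(t),
\qquad \omega_0 = \frac{2\pi}{T_0}, \qquad \delta = h\,T_0 .
```

```latex
X_k = a_1 X_{k-1} + a_2 X_{k-2} + \varepsilon_k,
\qquad a_1 = 2\,e^{-h\Delta t}\cos(\omega_1 \Delta t),
\qquad a_2 = -\,e^{-2h\Delta t},
\qquad \omega_1 = \sqrt{\omega_0^2 - h^2}.
```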
Both estimation procedures are derived from a continuous description of the phenomena under study, so the accuracy of the estimates is linked directly to the adequacy of the discrete representation of the random oscillations. This approach is similar to the concept of transforming differential equations into difference equations by approximating derivatives with the corresponding finite differences. The resulting discrete model, being an approximation, features a methodological error which can be decreased but never eliminated. To render such a representation more accurate, it is imperative to decrease the discretization interval and to increase the realization size, which raises the requirements on computing power. The spectral density and covariance function estimates comprise a non-parametric (non-formal) approach. In principle, any non-formal approach is a kind of art, i.e., the results depend on the performer's skills. Because subjective factors interfere with spectral or covariance estimates of random signals, the accuracy of the results cannot be properly determined or justified. To avoid the abovementioned difficulties, the application of linear time-series models with well-developed parameter estimation procedures is more advantageous. This book develops and presents a method for the analysis of random oscillations using a parametric model that corresponds exactly (with no approximation error) to a linear elastic system in discrete time. As a result, a one-to-one transformation of the model's numerical factors to the logarithmic decrement and natural frequency of random oscillations is established. This made it possible to develop a formal procedure for processing experimental data to obtain estimates of δ and ω₀. The proposed approach allows researchers to replace traditional subjective techniques with a formal processing procedure providing efficient estimates with analytically defined statistical uncertainties.
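As an illustration only (not the book's own procedure), the following Python sketch fits a Yule (AR(2)) model via the Yule-Walker equations and inverts the coefficient relations sketched above to recover the logarithmic decrement δ and natural frequency ω₀. It assumes an underdamped, zero-mean sampled oscillation with known sampling interval dt; the self-check at the end uses made-up parameter values.

```python
import numpy as np

def ar2_oscillation_parameters(x, dt):
    """Map Yule-Walker AR(2) estimates to (log decrement, natural frequency).

    Assumes the sampled process is underdamped, so the AR(2) characteristic
    roots form a complex-conjugate pair r * exp(+/- i*phi)."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    r0 = np.dot(x, x) / n                  # sample autocovariances
    r1 = np.dot(x[1:], x[:-1]) / n
    r2 = np.dot(x[2:], x[:-2]) / n
    a1, a2 = np.linalg.solve([[r0, r1], [r1, r0]], [r1, r2])  # Yule-Walker
    radius = np.sqrt(-a2)                  # r   = exp(-h * dt)
    phi = np.arccos(a1 / (2.0 * radius))   # phi = omega_1 * dt (damped frequency)
    h = -np.log(radius) / dt
    omega1 = phi / dt
    omega0 = np.hypot(omega1, h)           # natural frequency
    delta = h * 2.0 * np.pi / omega0       # logarithmic decrement delta = h * T0
    return delta, omega0

# Self-consistency check on a synthetic AR(2) series (assumed true values).
dt, n = 0.05, 50_000
h_true, w0_true = 0.3, 2.0 * np.pi
w1 = np.sqrt(w0_true**2 - h_true**2)
a1 = 2.0 * np.exp(-h_true * dt) * np.cos(w1 * dt)
a2 = -np.exp(-2.0 * h_true * dt)
rng = np.random.default_rng(0)
x = np.zeros(n)
for k in range(2, n):
    x[k] = a1 * x[k - 1] + a2 * x[k - 2] + rng.standard_normal()
print(ar2_oscillation_parameters(x, dt))   # expect roughly (0.3, 6.28)
```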
This multidisciplinary volume is the second in the STEAM-H series to feature invited contributions on mathematical applications in naval engineering. Seeking a more holistic approach that transcends current scientific boundaries, leading experts present interdisciplinary instruments and models on a broad range of topics. Each chapter places special emphasis on important methods, research directions, and applications of analysis within the field. Fundamental scientific and mathematical concepts are applied to topics such as microlattice materials in structural dynamics, acoustic transmission in low Mach number liquid flow, differential cavity ventilation on a symmetric airfoil, Kalman smoothers, metallic foam metamaterials for vibration damping and isolation, seal whiskers as a bio-inspired model for the reduction of vortex-induced vibrations, multidimensional integrals for multivariate weighted generalized Gaussian distributions, minimum uniform search track placement for rectangular regions, antennas in the maritime environment, the destabilizing impact of non-performers in multi-agent groups, and inertial navigation accuracy with bias modeling. Carefully peer-reviewed and pedagogically presented for a broad readership, this volume is perfectly suited to graduate and postdoctoral students interested in interdisciplinary research. Researchers in applied mathematics and sciences will find this book an important resource on the latest developments in naval engineering. In keeping with the ideals of the STEAM-H series, this volume will certainly inspire interdisciplinary understanding and collaboration.
The research presented in this book shows how combining deep neural networks with a special class of fuzzy logical rules and multi-criteria decision tools can make deep neural networks more interpretable - and even, in many cases, more efficient. Fuzzy logic together with multi-criteria decision-making provides very powerful tools for modeling human thinking. Building on their common theoretical basis, we propose a consistent framework for modeling human thinking using the tools of all three fields: fuzzy logic, multi-criteria decision-making, and deep learning, to help reduce the black-box nature of neural models, a challenge that is of vital importance to the whole research community.
This book provides awareness of the methods used for functional encryption in the academic and professional communities. The book covers functional encryption algorithms and their modern applications in developing secure systems via entity authentication, message authentication, software security, cyber security, hardware security, the Internet of Things (IoT), cloud security, smart card technology, CAPTCHA, digital signatures, and digital watermarking. This book is organized into fifteen chapters; topics include foundations of functional encryption, the impact of group theory in cryptosystems, elliptic curve cryptography, the XTR algorithm, pairing-based cryptography, NTRU algorithms, ring units, Cocks IBE schemes, Boneh-Franklin IBE, Sakai-Kasahara IBE, hierarchical identity-based encryption, attribute-based encryption, extensions of IBE and related primitives, and digital signatures. Explains the latest functional encryption algorithms in a simple way with examples; includes applications of functional encryption in information security, application security, and network security; relevant to academics, research scholars, software developers, etc.
This book illustrates how modern mathematical wavelet transform techniques offer fresh insights into the complex behavior of neural systems at different levels: from the microscopic dynamics of individual cells to the macroscopic behavior of large neural networks. It also demonstrates how and where wavelet-based mathematical tools can provide an advantage over classical approaches used in neuroscience. The authors describe both single-neuron and population-level neural recordings. This 2nd edition discusses novel areas and significant advances resulting from experimental techniques and computational approaches developed since 2015, and includes three new topics: * Detection of fEPSPs in multielectrode LFP recordings; * Analysis of visual sensory processing in the brain and BCI for human attention control; * Analysis and real-time classification of motor-related EEG patterns. The book is a valuable resource for neurophysiologists and physicists familiar with nonlinear dynamical systems and data processing, as well as for graduate students specializing in these and related areas.
How to build and maintain strong data organizations--the Dummies way. Data Governance For Dummies offers an accessible first step for decision makers into understanding how data governance works and how to apply it to an organization in a way that improves results and doesn't disrupt. Prep your organization to handle the data explosion (if you know, you know) and learn how to manage this valuable asset. Take full control of your organization's data with all the info and how-tos you need. This book walks you through making accurate data readily available and maintaining it in a secure environment. It serves as your step-by-step guide to extracting every ounce of value from your data: identify the impact and value of data in your business; design governance programs that fit your organization; discover and adopt tools that measure performance and need; and address data needs and build a more data-centric business culture. This is the perfect handbook for professionals in the world of data analysis and business intelligence, plus the people who interact with data on a daily basis. And, as always, Dummies explains things in terms anyone can understand, making it easy to learn everything you need to know.
RDF-based knowledge graphs require additional formalisms to be fully context-aware, and this book presents them. It also provides a collection of provenance techniques and state-of-the-art metadata-enhanced, provenance-aware, knowledge graph-based representations across multiple application domains, in order to demonstrate how to combine graph-based data models and provenance representations. This is important for making statements authoritative, verifiable, and reproducible, such as in biomedical, pharmaceutical, and cybersecurity applications, where the data source and generator can be just as important as the data itself. Capturing provenance is critical to ensure sound experimental results and rigorously designed research studies for patient and drug safety, pathology reports, and medical evidence generation. Similarly, provenance is needed for cyberthreat intelligence dashboards and attack maps that aggregate and/or fuse heterogeneous data from disparate data sources to differentiate between unimportant online events and dangerous cyberattacks, which is demonstrated in this book. Without provenance, data reliability and trustworthiness might be limited, causing issues with data reuse, trust, reproducibility, and accountability. This book primarily targets researchers who utilize knowledge graphs in their methods and approaches (this includes researchers from a variety of domains, such as cybersecurity, eHealth, data science, Semantic Web, etc.). It collects the core facts of the state of the art in provenance approaches and techniques, complemented by a critical review of existing approaches, and provides new research directions that combine data science and knowledge graphs, an increasingly important research topic.
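As a small illustration of the kind of provenance-aware representation discussed here (not taken from the book), the sketch below attaches PROV-O provenance to a reified RDF statement using rdflib; the ex: namespace, the domain statement, and the dataset URI are made up for the example.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/")              # hypothetical namespace
PROV = Namespace("http://www.w3.org/ns/prov#")     # W3C PROV-O vocabulary

g = Graph()
g.bind("ex", EX)
g.bind("prov", PROV)

# Domain statement: a drug-target interaction (example data only).
g.add((EX.drugA, EX.interactsWith, EX.proteinB))

# Reify the statement so provenance can be attached to it.
stmt = EX.stmt1
g.add((stmt, RDF.type, RDF.Statement))
g.add((stmt, RDF.subject, EX.drugA))
g.add((stmt, RDF.predicate, EX.interactsWith))
g.add((stmt, RDF.object, EX.proteinB))

# PROV-O provenance: where the statement came from, who produced it, and when.
g.add((stmt, PROV.wasDerivedFrom, URIRef("http://example.org/dataset/42")))
g.add((stmt, PROV.wasAttributedTo, EX.curationPipeline))
g.add((stmt, PROV.generatedAtTime,
       Literal("2021-06-01T12:00:00Z", datatype=XSD.dateTime)))

print(g.serialize(format="turtle"))
```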
This book is useful to engineers, researchers, entrepreneurs, and students in different branches of production, engineering, and systems sciences. The polytopic roadmaps are guidelines inspired by the development stages of cognitive-intelligent systems, and are expected to become powerful instruments releasing an abundance of new capabilities and structures for the implementation of complex engineering systems. The 4D approach developed in previous monographs and correlated with Industry 4.0 and the Fourth Industrial Revolution is continued here toward higher-dimensional approaches correlated with polytopic operations, equipment, technologies, industries, and societies. The methodology emphasizes the role of doubling, iteration, dimensionality, and cyclicality around the center, of periodic tables, and of conservative and exploratory strategies. Partitions, permutations, classifications, and complexification, as polytopic chemistry, are the elementary operations analyzed. Multi-scale transfer, cyclic operations, conveyors, and assembly lines are the practical examples of operations and equipment. Polytopic flow sheets, online analytical processing, polytopic engineering designs, and reality-inspired engineering are presented. Innovative concepts such as Industry 5.0, polytopic industry, Society 5.0, polytopic society, cyber-physical social systems, the industrial Internet, and digital twins are discussed. The general polytopic roadmaps (GPTR) are proposed as universal guidelines and as common methodologies to synthesize the systemic thinking and capabilities needed for implementing projects of growing complexity.
This book presents the latest advances in computational intelligence and data analytics for sustainable future smart cities. It focuses on computational intelligence and data analytics to bring together the smart city and sustainable city endeavors. It also discusses new models, practical solutions and technological advances related to the development and the transformation of cities through machine intelligence and big data models and techniques. This book is helpful for students and researchers as well as practitioners.
This book highlights current research into virtual tutoring software and presents a case study of the design and application of a social tutor for children with autism. Best practice guidelines for developing software-based educational interventions are discussed, with a major emphasis on facilitating the generalisation of skills to contexts outside of the software itself, and on maintaining these skills over time. Further, the book presents the software solution Thinking Head Whiteboard, which provides a framework for families and educators to create unique educational activities utilising virtual character technology and customised to match learners' needs and interests. In turn, the book describes the development and evaluation of a social tutor incorporating multiple life-like virtual humans, leading to an exploration of the lessons learned and recommendations for the future development of related technologies.
The subject of the book is the modeling, parameter estimation, and other aspects of the identification of nonlinear dynamic systems. The treatment is restricted to the input-output modeling approach. Because of the widespread use of digital computers, discrete-time methods are preferred. Time-domain parameter estimation methods are dealt with in detail; frequency-domain and power spectrum procedures are described briefly. The theory is presented from the engineering point of view, and a large number of case studies on the modeling and identification of real processes illustrate the methods. Almost all processes are nonlinear if they are considered beyond a small vicinity of the working point. To exploit industrial equipment as much as possible, mathematical models are needed which describe the global nonlinear behavior of the process. If the process is unknown, or if the describing equations are too complex, the structure and the parameters can be determined experimentally, which is the task of identification. The book is divided into seven chapters dealing with the following topics: 1. Nonlinear dynamic process models 2. Test signals for identification 3. Parameter estimation methods 4. Nonlinearity test methods 5. Structure identification 6. Model validity tests 7. Case studies on identification of real processes. Chapter 1 summarizes the different model descriptions of nonlinear dynamical systems.
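To make the flavor of time-domain parameter estimation concrete, here is a minimal sketch (not from the book) that fits a simple nonlinear input-output (NARX-type) model by ordinary least squares; the model structure, test signal, and parameter values are assumed purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a simple nonlinear discrete-time process (assumed structure):
#   y[k] = a*y[k-1] + b*u[k-1] + c*u[k-1]**2 + noise
a_true, b_true, c_true = 0.7, 1.2, -0.4
n = 2000
u = rng.uniform(-1.0, 1.0, n)                  # test signal: uniform white noise
y = np.zeros(n)
for k in range(1, n):
    y[k] = (a_true * y[k - 1] + b_true * u[k - 1]
            + c_true * u[k - 1] ** 2 + 0.05 * rng.standard_normal())

# Time-domain parameter estimation: regress y[k] on the chosen regressors.
Phi = np.column_stack([y[:-1], u[:-1], u[:-1] ** 2])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
print("estimated (a, b, c):", theta)           # expect roughly (0.7, 1.2, -0.4)
```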
This book focuses on distributed and economic Model Predictive Control (MPC) with applications in different fields. MPC is one of the most successful advanced control methodologies due to the simplicity of the basic idea (measure the current state, predict and optimize the future behavior of the plant to determine an input signal, and repeat this procedure ad infinitum) and its capability to deal with constrained nonlinear multi-input multi-output systems. While the basic idea is simple, the rigorous analysis of the MPC closed loop can be quite involved. Here, distributed means that either the computation is distributed to meet real-time requirements for (very) large-scale systems or that distributed agents act autonomously while being coupled via the constraints and/or the control objective. In the latter case, communication is necessary to maintain feasibility or to recover system-wide optimal performance. The term economic refers to general control tasks and, thus, goes beyond the typically predominant control objective of set-point stabilization. Here, recently developed concepts like (strict) dissipativity of optimal control problems or turnpike properties play a crucial role. The book collects research and survey articles on recent ideas and provides perspectives on current trends in nonlinear model predictive control. Indeed, the book is the outcome of a series of six workshops funded by the German Research Foundation (DFG) involving early-career scientists from different countries and from leading European industry stakeholders.
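A toy receding-horizon loop illustrating the basic MPC idea described above (a sketch, not an implementation from the book): a double-integrator plant, a quadratic cost, input bounds, and re-optimization at every step. The plant matrices, horizon, and weights are assumed for the example.

```python
import numpy as np
from scipy.optimize import minimize

# Double-integrator plant x = [position, velocity], scalar input u.
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.005], [0.1]])
N = 15                                   # prediction horizon
u_max = 1.0                              # input constraint |u| <= u_max

def predict_cost(u_seq, x0):
    """Quadratic cost of applying the input sequence from state x0."""
    x, cost = x0.copy(), 0.0
    for u in u_seq:
        x = A @ x + B.flatten() * u
        cost += x @ x + 0.1 * u * u      # state + input penalty
    return cost

x = np.array([2.0, 0.0])                 # initial state, away from the origin
u_guess = np.zeros(N)
for step in range(50):                   # closed-loop simulation
    res = minimize(predict_cost, u_guess, args=(x,),
                   bounds=[(-u_max, u_max)] * N)
    u0 = res.x[0]                        # apply only the first input
    x = A @ x + B.flatten() * u0         # plant update (no model mismatch here)
    u_guess = np.roll(res.x, -1)         # warm start for the next step
print("final state:", x)                 # should be close to the origin
```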
This book presents selected research papers from Innovation in Sustainable Energy and Technology India (ISET 2020), organized by the Energy Institute Bangalore (a unit of RGIPT, an Institute of National Importance), India, during 3-4 December 2020. The book covers various topics in sustainable energy and technologies, including renewable energy (solar photovoltaic, solar thermal and CSP, biomass, wind energy, micro hydro power, hydrogen energy, geothermal energy, energy materials, energy storage, hybrid energy), smart energy systems (electric vehicles, cybersecurity, charging infrastructure, IoT & AI, waste management, PHEV (CNG/EV)), and mobility (smart grids, IoT & AI, energy-efficient buildings, smart agriculture).
The aim of this book is to furnish the reader with a rigorous and detailed exposition of the concepts of control parametrization and time scaling transformation. It presents computational solution techniques for a special class of constrained optimal control problems as well as applications to some practical examples. The book may be considered an extension of the 1991 monograph A Unified Computational Approach to Optimal Control Problems by K.L. Teo, C.J. Goh, and K.H. Wong. This publication discusses the development of new theory and computational methods for solving various optimal control problems numerically and in a unified fashion. To keep the book accessible and uniform, it includes those results developed by the authors, their students, and their past and present collaborators. A brief review of methods that are not covered in this exposition is also included. Knowledge gained from this book may inspire the advancement of new techniques to solve complex problems that arise in the future. This book is intended as a reference for researchers in mathematics, engineering, and other sciences, graduate students, and practitioners who apply optimal control methods in their work. It may be appropriate reading material for a graduate-level seminar or as a text for a course in optimal control.
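A hedged sketch of the control parametrization idea (piecewise-constant control values optimized by a nonlinear programming solver); the dynamics, cost, horizon, and bounds below are assumed purely for illustration and are not taken from the book.

```python
import numpy as np
from scipy.optimize import minimize

T, n_seg, n_sub = 2.0, 10, 20             # horizon, control segments, Euler substeps
dt = T / (n_seg * n_sub)

def simulate_cost(u_params):
    """Integrate x' = -x + u with piecewise-constant u; return tracking cost."""
    x, cost = 0.0, 0.0
    for u in u_params:                    # one constant control value per segment
        for _ in range(n_sub):
            x += dt * (-x + u)            # forward-Euler step of the dynamics
            cost += dt * ((x - 1.0) ** 2 + 0.01 * u ** 2)
    return cost

res = minimize(simulate_cost, np.zeros(n_seg), bounds=[(-2.0, 2.0)] * n_seg)
print("optimal piecewise-constant control:", np.round(res.x, 3))
```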
This book describes the concepts and tools needed for water resources management, including methods for modeling, simulation, optimization, big data analysis, data mining, remote sensing, geographic information systems, game theory, conflict resolution, system dynamics, agent-based models, multiobjective, multicriteria, and multiattribute decision making, and risk and uncertainty analysis, for better and sustainable management of water resources and consumption, thus mitigating the present and future global water shortage crisis. It presents the applications of these tools through case studies which demonstrate the benefits of proper management of water resources systems. This book acts as a reference for students, professors, industrial practitioners, and stakeholders in the field of water resources and hydrology.
This book includes original, peer-reviewed research articles from the 2nd International Conference on Cybernetics, Cognition and Machine Learning Applications (ICCCMLA 2020), held in August 2020 in Goa, India. It covers the latest research trends and developments in the areas of data science, artificial intelligence, neural networks, cognitive science and machine learning applications, cyber-physical systems, and cybernetics.
This book is a collection of selected papers presented at the International Conference on Mathematical Analysis and Computing (ICMAC 2019) held at Sri Sivasubramaniya Nadar College of Engineering, Chennai, India, from 23-24 December 2019. Having found its applications in game theory, economics, and operations research, mathematical analysis plays an important role in analyzing models of physical systems and provides a sound logical base for problems stated in a qualitative manner. This book aims at disseminating recent advances in areas of mathematical analysis, soft computing, approximation and optimization through original research articles and expository survey papers. This book will be of value to research scholars, professors, and industrialists working in these areas.
The past few years have seen growing attention to, and rapid developments in, event-triggered sampled-data systems, in which the effect of event-triggered sensor measurements and controller updates is explored in controller analysis and design. This book offers the first systematic treatment of event-triggered sampled-data control system design using active disturbance rejection control (ADRC), an effective approach that is popular in both theoretical research and industrial applications. Extensive application examples with numerous illustrations are included to show how event-triggered ADRC with theoretical performance guarantees can be implemented in engineering systems and how the performance can actually be achieved. For theoretical researchers and graduate students, the presented results provide new directions in theoretical research on event-triggered sampled-data systems; for control practitioners, the book offers an effective approach to achieving satisfactory performance with limited sampling rates.
This book provides a novel approach to the diagnosis of complex technical systems that are widely used in various kinds of transportation, energy, metallurgy, metalworking, fuel, mining, chemical, and paper industries, among others. Effective diagnostic systems are necessary for the early detection of errors in mechatronic systems, for the organization of maintenance, and for the assessment of the quality of the performed service. Unfortunately, the practical use of AI in the diagnosis of mechatronic systems is still quite limited, and the inability to build effective diagnostic systems leads to significant economic losses and dangers. The main aim of this book is to contribute to knowledge on the diagnostics of mechatronic systems by analyzing the reliability characteristics of their elements, using methods, models and algorithms for diagnostics, and by studying examples of model diagnostic systems using AI methods based on neural networks, fuzzy inference systems, and genetic algorithms.
This book constitutes the refereed post-conference proceedings of the 4th International Conference on Intelligence Science, ICIS 2020, held in Durgapur, India, in February 2021 (originally November 2020). The 23 full papers and 4 short papers presented were carefully reviewed and selected from 42 submissions. One extended abstract is also included. They deal with key issues in brain cognition; uncertain theory; machine learning; data intelligence; language cognition; vision cognition; perceptual intelligence; intelligent robot; and medical artificial intelligence.
The last two decades have seen the development of a number of models that have proven particularly important in advancing understanding of message-production processes. Now it appears that a "second generation" of theories is emerging, one that reflects considerable conceptual advances over earlier models. "Message Production: Advances in Communication Theory" focuses on these new developments in theoretical approaches to verbal and nonverbal message production. The chapters reflect a number of characteristics and trends resident in these theories including:
This book explores the genesis of ransomware and how the parallel emergence of encryption technologies has elevated ransomware to become the most prodigious cyber threat that enterprises are confronting. It also investigates the driving forces behind what has been dubbed the 'ransomware revolution' after a series of major attacks beginning in 2013, and how the advent of cryptocurrencies provided the catalyst for the development and increased profitability of ransomware, sparking a phenomenal rise in the number and complexity of ransomware attacks. This book analyzes why the speed of technology adoption has been a fundamental factor in the continued success of financially motivated cybercrime, and how the ease of public access to advanced encryption techniques has allowed malicious actors to continue to operate with increased anonymity across the internet. This anonymity has enabled increased collaboration between attackers, which has aided the development of new ransomware attacks, and led to an increasing level of technical complexity in ransomware attacks. This book highlights that the continuous expansion and early adoption of emerging technologies may be beyond the capacity of conventional risk managers and risk management frameworks. Researchers and advanced level students studying or working in computer science, business or criminology will find this book useful as a reference or secondary text. Professionals working in cybersecurity, cryptography, information technology, financial crime (and other related topics) will also welcome this book as a reference.
This book covers some selected problems of the descriptor integer and fractional order positive continuous-time and discrete-time systems. The book consists of 3 chapters, 4 appendices and the list of references. Chapter 1 is devoted to descriptor integer order continuous-time and discrete-time linear systems. In Chapter 2, descriptor fractional order continuous-time and discrete-time linear systems are considered. Chapter 3 is devoted to the stability of descriptor continuous-time and discrete-time systems of integer and fractional orders. In Appendix A, extensions of the Cayley-Hamilton theorem for descriptor linear systems are given. Some methods for computation of the Drazin inverse are presented in Appendix B. In Appendix C, some basic definitions and theorems on Laplace transforms and Z-transforms are given. Some properties of the nilpotent matrices are given in Appendix D.
This proceedings book presents state-of-the-art developments in theory, methodology, and applications of network analysis across sociology, computational science, education research, literature studies, political science, international relations, social media research, and urban studies. The papers comprising this collection were presented at the Fifth 'Networks in the Global World' conference organized by the Centre for German and European Studies of St. Petersburg University and Bielefeld University and held on July 7-9, 2020. This biennial conference series revolves around key interdisciplinary issues that are the focus of network analysts, such as the multidimensional approach to social reality, translation of theories and methods across disciplines, and mixing of data and methods. The distinctive features of this book are the emphasis on in-depth linkages between theory, method, and applications, the blend of qualitative and quantitative methods, and the joint consideration of different network levels, types, and contexts. The topics covered by the papers include interrelation of social and cultural structures, constellations of power, and patterns of interaction in areas ranging from various types of communities (local, international, educational, political, and so on) to social media and literature. The book is useful for practicing researchers, graduate and postgraduate students, and educators interested in network analysis of social relations, politics, economy, and culture. Features that set the book apart from others in the field: * The book offers a unique cross-disciplinary blend of computational and ethnographic network analyses applied to a diverse spectrum of spheres, from literature and education to urban planning and policymaking. * Embracing conceptual, methodological, and empirical works, the book is among the few in network analysis to emphasize connections between theory, method, and applications. * The book brings together authors and empirical contexts from all over the globe, with a particular emphasis on European societies.