We are confronted with problems in virtually all of our major systems. Einstein said that "we cannot solve our problems on the same level of thinking we were on when we created them." We believe a fundamental fault underlying all these problems is the way we look at them. Our traditional method has been to analyze the system as if it were a machine, to find the faulty elements, and to fix them. We have done this with ailing ecosystems long enough to know that it doesn't work well. If it doesn't work there because of the complexity of the system, how can we expect it to work on us or on other equally, if not more, complex living systems? We are not machines. We can adapt and create in novel ways. In The Boids and the Bees we, and other living systems, are seen as the complex and adaptive systems that we (they) are, which leads to a perceptual revolution:
- We are fighting a war with bacteria that we can't win. Seeing bacteria as adapting agents allows us to see how they adapt and opens other doors to end the war.
- Patients who are informed and empowered can lead our health care system to focus on prevention and health rather than illness and profits.
- Learning has been analyzed in the laboratory, and we now use the results of this analysis to teach our children; they become the lab rats in the classroom. Seeing them as adaptive agents is the first step in correcting this dehumanizing error.
How we adapt today will determine our tomorrows, and they can be optimized.
In June of 2002, over 500 professors, students and researchers met in Boston, Massachusetts, for the Fourth International Conference on Complex Systems. The attendees represented a remarkably diverse collection of fields: biology, ecology, physics, engineering, computer science, economics, psychology and sociology. The goal of the conference was to encourage cross-fertilization between the many disciplines represented and to deepen understanding of the properties common to all complex systems. This volume contains 43 papers selected from the more than 200 presented at the conference. Topics include: cellular automata, neurology, evolution, computer science, network dynamics, and urban planning. About NECSI: For over 10 years, The New England Complex Systems Institute (NECSI) has been instrumental in the development of complex systems science and its applications. NECSI conducts research, education, knowledge dissemination, and community development around the world for the promotion of the study of complex systems and its application for the betterment of society. NECSI hosts the International Conference on Complex Systems and publishes the NECSI Book Series in conjunction with Springer Publishers. ALI MINAI is an Affiliate of the New England Complex Systems Institute and an Associate Professor in the Department of Electrical and Computer Engineering and Computer Science at the University of Cincinnati. YANEER BAR-YAM is President and founder of the New England Complex Systems Institute. He is the author of Dynamics of Complex Systems and Making Things Work: Solving Complex Problems in a Complex World.
Experimental Econophysics describes the method of controlled human experiments developed by physicists to study problems in economics and finance: stylized facts, fluctuation phenomena, herd behavior, contrarian behavior, hedge behavior, cooperation, business cycles, partial information, risk management, and stock prediction. Experimental econophysics and empirical econophysics are the two branches of the field of econophysics. The latter has been extensively discussed in existing books, while the former has seldom been touched; this book therefore focuses on the experimental branch. Empirical econophysics is based on the analysis of data from real markets using statistical tools borrowed from traditional statistical physics. In contrast, inspired by the role of controlled experiments and system modelling (for computer simulations and/or analytical theory) in the development of modern physics, experimental econophysics relies on controlled human experiments in the laboratory (producing data for analysis) together with agent-based modelling, with the aim of revealing general cause-effect relationships between specific parameters and emergent properties of real economic and financial markets. This book covers the basic concepts, experimental methods, modelling approaches, and latest progress in the field of experimental econophysics.
Intelligent technical systems, which combine mechanical, electrical and software engineering with control engineering and advanced mathematics, go far beyond the state of the art in mechatronics and open up fascinating perspectives. Among these systems are so-called self-optimizing systems, which are able to adapt their behavior autonomously and flexibly to changing operating conditions. Self-optimizing systems create high value, for example in terms of energy and resource efficiency as well as reliability. The Collaborative Research Center 614 "Self-optimizing Concepts and Structures in Mechanical Engineering" pursued the long-term aim of opening up the active paradigm of self-optimization for mechanical engineering and enabling others to develop self-optimizing systems. This book is directed to researchers and practitioners alike. It provides a design methodology for the development of self-optimizing systems consisting of a reference process, methods, and tools. The reference process is divided into two phases: the domain-spanning conceptual design, and the domain-specific design and development. For the conceptual design a holistic approach is provided. Domain-specific methods and tools developed especially for the design and development of self-optimizing systems are described and illustrated by application examples. This book will enable the reader to identify the potential for self-optimization and to develop self-optimizing systems independently.
The book covers nonlinear physical problems and mathematical modeling, including molecular biology, genetics, neurosciences, and artificial intelligence, along with classical problems in mechanics, astronomy and physics. The chapters present nonlinear mathematical modeling in life science and physics through nonlinear differential equations, nonlinear discrete equations and hybrid equations. Such modeling can be effectively applied to a wide spectrum of nonlinear physical problems, including Kolmogorov-Arnold-Moser (KAM) theory, singular differential equations, impulsive dichotomous linear systems, analytical bifurcation trees of periodic motions, and almost or pseudo-almost periodic solutions in nonlinear dynamical systems.
This book develops applications of novel generalizations of fuzzy information measures in the fields of pattern recognition, medical diagnosis, multi-criteria and multi-attribute decision making, and their suitability for linguistic variables. The presentation focuses on introducing consistently strong and efficient generalizations of information and information-theoretic divergence measures in fuzzy and intuitionistic fuzzy environments, covering different practical examples. The target audience comprises primarily researchers and practitioners in the involved fields, but the book may also be beneficial for graduate students.
The systems movement is made up of many systems societies as well as disciplinary researchers and research programmes, explicitly or implicitly focusing on the subject of systemics, officially introduced in the scientific community fifty years ago. Research in many different fields has been, and continues to be, a source of new ideas and challenges for the systems community. In this regard, a very important topic is that of EMERGENCE. Among the goals for current and future systems scientists are certainly the definition of a general theory of emergence and the building of a general model of it. The Italian Systems Society, Associazione Italiana per la Ricerca sui Sistemi (AIRS), decided to devote its Second National Conference to this subject. Because AIRS is organized as a network of researchers, institutions, scholars, professionals, and teachers, its research activity has an impact at different levels and in different ways. Thus the topic of emergence was not only the focus of this conference but is actually the main subject of many AIRS activities.
This volume contains the lectures presented at the workshop on "Advances in Mathematical Systems Theory," held on the island of Borkum, Germany (April 20-23, 1999). The book will be of interest to graduate students and researchers interested in control theory and mathematical systems theory, who will find in-depth analysis and presentations from diverse perspectives interacting in this lively area. The editors are proud to dedicate this volume to Diederich Hinrichsen on the occasion of his 60th birthday in acknowledgment of his major contributions to linear systems theory and control theory and his long-term achievements in establishing mathematical systems theory in Germany. We all owe much to him as a teacher, colleague, and friend. The editors thank the Graduiertenkolleg "Komplexe Dynamische Systeme" at the University of Bremen as well as the European "Nonlinear Control Network" for providing financial support that enabled this workshop. Fritz Colonius (Augsburg, Germany), Uwe Helmke (Würzburg, Germany), Dieter Prätzel-Wolters (Kaiserslautern, Germany), Fabian Wirth (Bremen, Germany). Introduction: The workshop "Advances in Mathematical Systems Theory" took place in honor of Diederich Hinrichsen on the occasion of his 60th birthday. The following chapters are based on invited lectures and cover a wide range of topics in linear and nonlinear systems theory including parameterization problems, behaviors of linear systems and convolutional codes, as well as complementarity systems and hybrid systems.
This book reveals how open innovation utilizes the developing circle of business models to establish new ones that define a unique link between technology and markets, focusing on how to develop and maintain successful business models. It draws readers into the philosophy and economic effects of open innovation from the outset. It presents four different developing-circle business models for customers in the role of consumers, entrepreneurs, social entrepreneurs and engineers respectively, enabling each group to develop, utilize and enlarge creative business models, and even switch business models. In addition to these four circles, it takes a systemic approach to describe the relationship between technology and markets. From this relationship an open innovation strategy towards entrepreneurship can be adopted. From Open Innovation to a Creative Developing-Circle Business Model is an essential resource for start-up entrepreneurs, as well as for students of technology management, strategy and open innovation.
"Intelligent Control" considers non-traditional modelling and control approaches to nonlinear systems. Fuzzy logic, neural networks and evolutionary computing techniques are the main tools used. The book presents a modular switching fuzzy logic controller where a PD-type fuzzy controller is executed first, followed by a PI-type fuzzy controller, thus improving performance compared with a PID-type fuzzy controller. The advantage of the switching-type fuzzy controller is that it uses one rule-base, thus minimising the rule-base during execution. A single rule-base is developed by merging the membership functions for change of error of the PD-type controller and sum of error of the PI-type controller. Membership functions are then optimized using evolutionary algorithms. Since the two fuzzy controllers are executed in series, further tuning of the differential and integral scaling factors of the controller is then performed. Neural-network-based tuning for the scaling parameters of the fuzzy controller is then described, and finally an evolutionary algorithm is applied to the neurally-tuned fuzzy controller, in which the sigmoidal function shape of the neural network is determined. The important issue of stability is addressed, and the text demonstrates empirically that the developed controller was stable within the operating range. The text concludes with ideas for future research to show the reader the potential for further study in this area. "Intelligent Control" will be of interest to researchers from engineering and computer science backgrounds working in intelligent and adaptive control.
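The PD-type stage described above rests on standard fuzzy-inference machinery. As a rough, generic sketch of one such control step (the set labels, rule table and output centres below are illustrative assumptions, not the book's tuned design):

```python
# A minimal fuzzy PD-type control step with triangular membership functions
# and weighted-average defuzzification. This is generic Mamdani-style
# machinery, not the switching controller developed in the book.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Three fuzzy sets over [-1, 1], shared by error and change-of-error.
SETS = {"N": (-2.0, -1.0, 0.0), "Z": (-1.0, 0.0, 1.0), "P": (0.0, 1.0, 2.0)}

# Rule base: (error label, delta-error label) -> crisp output centre.
RULES = {("N", "N"): -1.0, ("N", "Z"): -0.5, ("N", "P"): 0.0,
         ("Z", "N"): -0.5, ("Z", "Z"): 0.0, ("Z", "P"): 0.5,
         ("P", "N"): 0.0, ("P", "Z"): 0.5, ("P", "P"): 1.0}

def fuzzy_pd(error, d_error):
    """Fire every rule (min for AND), then defuzzify by weighted average."""
    num = den = 0.0
    for (e_lab, de_lab), centre in RULES.items():
        w = min(tri(error, *SETS[e_lab]), tri(d_error, *SETS[de_lab]))
        num += w * centre
        den += w
    return num / den if den else 0.0
```

A PI-type stage sharing the same rule-base would feed in the sum of error instead of the error, which is the economy the switching design exploits.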
One criterion for classifying books is whether they are written for a single purpose or for multiple purposes. This book belongs to the category of multipurpose books, but one of its roles is predominant: it is primarily a textbook. As such, it can be used for a variety of courses at the first-year graduate or upper-division undergraduate level. A common characteristic of these courses is that they cover fundamental systems concepts, major categories of systems problems, and some selected methods for dealing with these problems at a rather general level. A unique feature of the book is that the concepts, problems, and methods are introduced in the context of an architectural formulation of an expert system, referred to as the general systems problem solver or GSPS, whose aim is to provide users of all kinds with computer-based systems knowledge and methodology. The GSPS architecture, which is developed throughout the book, facilitates a framework that is conducive to a coherent, comprehensive, and pragmatic coverage of systems fundamentals: concepts, problems, and methods. A course that covers systems fundamentals is now offered not only in systems science, information science, or systems engineering programs, but in many programs in other disciplines as well. Although the level of coverage for systems science or engineering students is surely different from that used for students in other disciplines, this book is designed to serve both of these needs.
Most machine learning research has been concerned with the development of systems that implement one type of inference within a single representational paradigm. Such systems, which can be called monostrategy learning systems, include those for empirical induction of decision trees or rules, explanation-based generalization, neural net learning from examples, genetic algorithm-based learning, and others. Monostrategy learning systems can be very effective and useful if learning problems to which they are applied are sufficiently narrowly defined. Many real-world applications, however, pose learning problems that go beyond the capability of monostrategy learning methods. In view of this, recent years have witnessed a growing interest in developing multistrategy systems, which integrate two or more inference types and/or paradigms within one learning system. Such multistrategy systems take advantage of the complementarity of different inference types or representational mechanisms. Therefore, they have a potential to be more versatile and more powerful than monostrategy systems. On the other hand, due to their greater complexity, their development is significantly more difficult and represents a new great challenge to the machine learning community. Multistrategy Learning contains contributions characteristic of the current research in this area.
This book focuses on information geometry manifolds of structured data/information and their advanced applications featuring new and fruitful interactions between several branches of science: information science, mathematics and physics. It addresses interrelations between different mathematical domains like shape spaces, probability/optimization & algorithms on manifolds, relational and discrete metric spaces, computational and Hessian information geometry, algebraic/infinite dimensional/Banach information manifolds, divergence geometry, tensor-valued morphology, optimal transport theory, manifold & topology learning, and applications like geometries of audio-processing, inverse problems and signal processing. The book collects the most important contributions to the conference GSI'2017 - Geometric Science of Information.
High performance computing consumes and generates vast amounts of data, and the storage, retrieval, and transmission of this data are major obstacles to effective use of computing power. Challenges inherent in all of these operations are security, speed, reliability, authentication and reproducibility. This workshop focused on a wide variety of technical results aimed at meeting these challenges. Topics ranging from the mathematics of coding theory to the practicalities of copyright preservation for Internet resources drew spirited discussion and interaction among experts in diverse but related fields. We hope this volume contributes to continuing this dialogue.
Whether costs are to be reduced, profits to be maximized, or scarce resources to be used wisely, optimization methods are available to guide decision making. In online optimization the main issue is incomplete data, and the scientific challenge: How well can an online algorithm perform? Can one guarantee solution quality, even without knowing all data in advance? In real-time optimization there is an additional requirement: decisions have to be computed very fast in relation to the time frame of the instance under consideration. Online and real-time optimization problems occur in all branches of optimization. These areas have developed their own techniques, but they are addressing the same issues: quality, stability, and robustness of the solutions. To fertilize this emerging topic of optimization theory and to foster cooperation between the different branches of optimization, the Deutsche Forschungsgemeinschaft (DFG) has supported a Priority Programme "Online Optimization of Large Systems".
This book focuses on new and emerging data mining solutions that offer a greater level of transparency than existing solutions. Transparent data mining solutions with desirable properties (e.g. effective, fully automatic, scalable) are covered in the book. Experimental findings of transparent solutions are tailored to different domain experts, and experimental metrics for evaluating algorithmic transparency are presented. The book also discusses societal effects of black-box vs. transparent approaches to data mining, as well as real-world use cases for these approaches. As algorithms increasingly support different aspects of modern life, a greater level of transparency is sorely needed, not least because discrimination and biases have to be avoided. With contributions from domain experts, this book provides an overview of an emerging area of data mining that has profound societal consequences, and provides the technical background for readers to contribute to the field or to put existing approaches to practical use.
New information technologies enable us to interact with each other in totally new ways. The Interaction Society: Theories, Practice and Supportive Technologies provides readers with a rich overview of the emerging interaction society enabled by these new information and communication technologies (ICT). Readers will gain a theoretically deep understanding of the core issues related to the character of the emerging interaction society, be exposed to empirical case studies that can help to understand the impact of this emergence through analysis of concrete examples, and benefit from descriptions of concrete design projects aimed at designing novel information technologies to support activities in the interaction society.
This volume constitutes a comprehensive self-contained course on source encoding. This is a rapidly developing field and the purpose of this book is to present the theory from its beginnings to the latest developments, some of which appear in book form for the first time. The major differences between this volume and previously published works are, first, that here information retrieval is incorporated into source coding instead of being discussed separately. Second, this volume places an emphasis on the trade-off between complexity and the quality of coding; i.e. what is the price of achieving a maximum degree of data compression? Third, special attention is paid to universal families which contain a good compressing map for every source in a set. The volume presents a new algorithm for retrieval, which is optimal with respect to both program length and running time, and algorithms for hashing and adaptive on-line compressing. All the main tools of source coding and data compression, such as Shannon, Ziv-Lempel and Gilbert-Moore codes, Kolmogorov complexity, epsilon-entropy, and lexicographic and digital search, are discussed. Moreover, data compression methods are described for developing short programs for partially specified Boolean functions, short formulas for threshold functions, identification keys, stochastic algorithms for finding the occurrence of a word in a text, and T-independent sets. Intended for researchers and graduate students of information theory and theoretical computer science, the book will also serve as a useful reference for communication engineers and database designers.
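As a taste of the dictionary-based Ziv-Lempel coding mentioned above, here is a minimal textbook-style LZ78 encoder/decoder sketch; it is a generic construction for illustration, not one of the volume's own algorithms:

```python
# Minimal LZ78: the encoder grows a dictionary of phrases seen so far and
# emits (dictionary index, next character) pairs; the decoder rebuilds the
# same dictionary and inverts the process.

def lz78_encode(text):
    """Encode text as a list of (phrase index, next character) pairs."""
    dictionary = {"": 0}          # phrase -> index; the empty phrase is 0
    output = []
    phrase = ""
    for ch in text:
        if phrase + ch in dictionary:
            phrase += ch          # keep extending the current phrase
        else:
            output.append((dictionary[phrase], ch))
            dictionary[phrase + ch] = len(dictionary)
            phrase = ""
    if phrase:                    # flush a trailing phrase already in the dictionary
        output.append((dictionary[phrase[:-1]], phrase[-1]))
    return output

def lz78_decode(pairs):
    """Invert lz78_encode by rebuilding the phrase table."""
    phrases = [""]
    out = []
    for index, ch in pairs:
        phrase = phrases[index] + ch
        phrases.append(phrase)
        out.append(phrase)
    return "".join(out)
```

Compression comes from the index referencing an arbitrarily long previously seen phrase, so repetitive sources shrink; e.g. `lz78_encode("abab")` yields `[(0, "a"), (0, "b"), (1, "b")]`.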
The idea for this book evolved during the process of its preparation, as some of the results were achieved in parallel with its writing. One reason for this is that in this area of research, results are updated very quickly. Another is, possibly, that a strong, unchallenged theoretical basis in this field still does not fully exist. On the other hand, the rate of innovation, competition and demand from different branches of industry (from the biotech industry to civil and building engineering, from market forecasting to civil aviation, from robotics to emerging e-commerce) is increasingly pressing for more customised solutions based on learning consumers' behaviour. A highly interdisciplinary and rapidly innovating field is forming whose focus is the design of intelligent, self-adapting systems and machines. It stands at the crossroads of control theory, artificial and computational intelligence, and different engineering disciplines, borrowing heavily from biology and the life sciences. It is often called intelligent control, soft computing or intelligent technology. Some other branches have appeared recently, like intelligent agents (which migrated from robotics to different engineering fields), data fusion, knowledge extraction etc., which are inherently related to this field. At the core are attempts to enhance the abilities of classical control theory in order to obtain more adequate, flexible, and adaptive models and control algorithms.
The book presents findings, views and ideas on what exact problems of image processing, pattern recognition and generation can be efficiently solved by cellular automata architectures. This volume provides a convenient collection in this area, in which publications are otherwise widely scattered throughout the literature. The topics covered include image compression and resizing; skeletonization, erosion and dilation; convex hull computation, edge detection and segmentation; forgery detection and content based retrieval; and pattern generation. The book advances the theory of image processing, pattern recognition and generation as well as the design of efficient algorithms and hardware for parallel image processing and analysis. It is aimed at computer scientists, software programmers, electronic engineers, mathematicians and physicists, and at everyone who studies or develops cellular automaton algorithms and tools for image processing and analysis, or develops novel architectures and implementations of massive parallel computing devices. The book will provide attractive reading for a general audience because it has do-it-yourself appeal: all the computer experiments presented within it can be implemented with minimal knowledge of programming. The simplicity yet substantial functionality of the cellular automaton approach, and the transparency of the algorithms proposed, makes the text ideal supplementary reading for courses on image processing, parallel computing, automata theory and applications.
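In the do-it-yourself spirit described above, one cellular-automaton step for binary image erosion fits in a few lines; the neighbourhood and rule here are a generic illustration, not an algorithm taken from the book:

```python
# One synchronous cellular-automaton step implementing binary erosion:
# each pixel is a cell that survives only if it and its four von Neumann
# neighbours are all 1 (out-of-bounds neighbours count as 0).

def erode(image):
    """Apply one erosion step to a binary image given as a list of rows."""
    rows, cols = len(image), len(image[0])

    def cell(r, c):
        # Treat pixels outside the image as background (0).
        return image[r][c] if 0 <= r < rows and 0 <= c < cols else 0

    return [
        [1 if all(cell(r + dr, c + dc)
                  for dr, dc in ((0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)))
         else 0
         for c in range(cols)]
        for r in range(rows)
    ]

# A 3x3 block of ones erodes to its single interior pixel.
square = [
    [1, 1, 1],
    [1, 1, 1],
    [1, 1, 1],
]
```

Dilation is the dual rule (a pixel turns 1 if any neighbour is 1), and iterating such local updates in parallel is exactly the massively parallel style of computation the book surveys.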
Information theory is an exceptional field in many ways. Technically, it is one of the rare fields in which mathematical results and insights have led directly to significant engineering payoffs. Professionally, it is a field that has sustained a remarkable degree of community, collegiality and high standards. James L. Massey, whose work in the field is honored here, embodies the highest standards of the profession in his own career. The book covers the latest work on: block coding, convolutional coding, cryptography, and information theory. The 44 contributions represent a cross-section of the world's leading scholars, scientists and researchers in information theory and communication. The book is rounded off with an index and a bibliography of publications by James Massey.
You may like...
- Advocacy for Social and Linguistic… by Christine E. Poteau, Carter A. Winkle (Hardcover, R4,489)
- Investing in Early Childhood Development… by A. Tarlov, M. Debbink (Hardcover)
- True Partnerships in SEND - Working… by Heather Green, Becky Edwards (Paperback, R885)
- Professionalism in Early Childhood… by Carmen Dalli, Mathias Urban (Paperback, R1,683)
- Techniques for Teaching Young Children… by Glenda MacNaughton, Gillian Williams (Paperback, R2,083)
- A History of Holistic Literacy - Five… by M.P. Cavanaugh (Hardcover)
- Professionalism in Early Childhood… by Carmen Dalli, Mathias Urban (Hardcover, R4,349)
- Contemporary Perspectives on Play in… by Olivia Natividad Saracho, Olivia N. Saracho (Hardcover, R2,553)