Welcome to Loot.co.za!
Books > Reference & Interdisciplinary > Communication studies > Information theory
This book opens a novel dimension in the 50-year history of the mathematical theory of "information" since the birth of Shannon theory. First of all, it introduces, in place of the traditional notions of entropy and mutual information, the completely new and highly unconventional approach of the "information spectrum" as a basic but powerful tool for constructing a general theory of information. Reconstructing step by step all the essential major topics of information theory from the viewpoint of this information spectrum, this comprehensive work provides an accessible introduction to a new type of mathematical theory of information that focuses mainly on general nonstationary and/or nonergodic sources and channels, in clear contrast with the traditional theories of information. This book is a new, non-traditional theoretical reference for communication professionals and statisticians specializing in information theory.
This monograph focuses on those stochastic quickest detection tasks in disorder problems that arise in the dynamical analysis of statistical data. These include quickest detection of randomly appearing targets, of spontaneously arising effects, and of arbitrage (in financial mathematics). There is also currently great interest in quickest detection methods for randomly occurring intrusions in information systems and in the design of defense methods against cyber-attacks. The author shows that the majority of quickest detection problems can be reformulated as optimal stopping problems where the stopping time is the moment the occurrence of disorder is signaled. Thus, considerable attention is devoted to the general theory of optimal stopping rules, and to its concrete problem-solving methods. The exposition covers both the discrete-time case, which is in principle relatively simple and allows step-by-step considerations, and the continuous-time case, which often requires more technical machinery such as martingales, supermartingales, and stochastic integrals. There is a focus on the well-developed apparatus of Brownian motion, which enables the exact solution of many problems. The last chapter presents applications to financial markets. Researchers and graduate students interested in probability, decision theory and statistical sequential analysis will find this book useful.
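As a toy illustration of the "quickest detection as optimal stopping" idea the blurb describes, here is a minimal sketch of one classical rule of this kind, Page's CUSUM test for a mean shift in a Gaussian data stream. The function name, parameter values, and test data below are illustrative assumptions, not taken from the monograph:

```python
def cusum_stop(xs, mu0=0.0, mu1=1.0, h=5.0):
    """Signal disorder the first time the CUSUM statistic reaches threshold h.

    Detects a shift in mean from mu0 to mu1 (unit-variance Gaussian model).
    Returns the stopping time (index), or None if no alarm is raised.
    """
    s = 0.0
    for t, x in enumerate(xs):
        # Log-likelihood-ratio increment for N(mu1, 1) against N(mu0, 1)
        llr = (mu1 - mu0) * (x - (mu0 + mu1) / 2.0)
        # CUSUM recursion: accumulate evidence, but never below zero
        s = max(0.0, s + llr)
        if s >= h:
            return t  # stopping time: disorder signaled here
    return None

# A stream whose mean shifts from 0 to 1 at index 50: the alarm fires
# a short, bounded delay after the change point.
stream = [0.0] * 50 + [1.0] * 50
print(cusum_stop(stream))  # 59
```

The `max(0.0, ...)` reset is what makes the statistic a stopping rule with controlled false-alarm behavior: pre-change evidence drains away instead of accumulating.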
Originally published in 1995, Large Deviations for Performance Analysis consists of two synergistic parts. The first half develops the theory of large deviations from the beginning, through recent results on the theory for processes with boundaries, keeping to a very narrow path: continuous-time, discrete-state processes. By developing only what is needed for the applications, the theory is kept to a manageable level, both in terms of length and in terms of difficulty. Within its scope, the treatment is detailed, comprehensive and self-contained. As the book shows, there are sufficiently many interesting applications of jump Markov processes to warrant a special treatment. The second half is a collection of applications developed at Bell Laboratories. The applications cover large areas of the theory of communication networks: circuit-switched transmission, packet transmission, multiple access channels, and the M/M/1 queue. Aspects of parallel computation are covered as well, including basics of job allocation, rollback-based parallel simulation, assorted priority queueing models that might be used in performance models of various computer architectures, and asymptotic coupling of processors. These applications are thoroughly analysed using the tools developed in the first half of the book.
The Science of Deep Learning emerged from courses taught by the author that have provided thousands of students with training and experience for their academic studies, and prepared them for careers in deep learning, machine learning, and artificial intelligence in top companies in industry and academia. The book begins by covering the foundations of deep learning, followed by key deep learning architectures. Subsequent parts on generative models and reinforcement learning may be used as part of a deep learning course or as part of a course on each topic. The book includes state-of-the-art topics such as Transformers, graph neural networks, variational autoencoders, and deep reinforcement learning, with a broad range of applications. The appendices provide equations for computing gradients in backpropagation and optimization, and best practices in scientific writing and reviewing. The text presents an up-to-date guide to the field built upon clear visualizations using a unified notation and equations, lowering the barrier to entry for the reader. The accompanying website provides complementary code and hundreds of exercises with solutions.
Complex system studies are a growing area of central importance to a wide range of disciplines, ranging from physics to politics and beyond. Adopting this interdisciplinary approach, Systems, Self-Organisation and Information presents and discusses a range of ground-breaking research in complex systems theory. Building upon foundational concepts, the volume introduces a theory of Self-Organization, providing definitions of concepts including system, structure, organization, functionality, and boundary. Biophysical and cognitive approaches to Self-Organization are also covered, discussing the complex dynamics of living beings and the brain, and self-organized adaptation and learning in computational systems. The convergence of Peircean philosophy with the study of Self-Organization also provides an original pathway of research, which contributes to a dialogue between pragmatism, semeiotics, complexity theory, and self-organizing systems. As one of the few interdisciplinary works on systems theory, relating Self-Organization and Information Theory, Systems, Self-Organisation and Information is an invaluable resource for researchers and postgraduate students interested in complex systems theory from related disciplines including philosophy, physics, and engineering.
Our world and the people within it are increasingly interpreted and classified by automated systems. At the same time, automated classifications influence what happens in the physical world. These entanglements change what it means to interact with governance, and shift what elements of our identity are knowable and meaningful. In this cyber-physical world, or 'world state', what is the role for law? Specifically, how should law address the claim that computational systems know us better than we know ourselves? Monitoring Laws traces the history of government profiling from the invention of photography through to emerging applications of computer vision for personality and behavioral analysis. It asks what dimensions of profiling have provoked legal intervention in the past, and what is different about contemporary profiling that requires updating our legal tools. This work should be read by anyone interested in how computation is changing society and governance, and what it is about people that law should protect in a computational world.
The volume that you have before you is the result of a growing realization that fluctuations in nonequilibrium systems play a much more important role than was first believed. It has become clear that in nonequilibrium systems noise plays an active, one might even say a creative, role in processes involving self-organization, pattern formation, and coherence, as well as in biological information processing, energy transduction, and functionality. Now is not the time for a comprehensive summary of these new ideas, and I am certainly not the person to attempt such a thing. Rather, this short introductory essay (and the book as a whole) is an attempt to describe where we are at present and how the viewpoint that has evolved in the last decade or so differs from those of past decades. Fluctuations arise either because of the coupling of a particular system to an external unknown or "unknowable" system or because the particular description we are using is only a coarse-grained description which on some level is an approximation. We describe the unpredictable and random deviations from our deterministic equations of motion as noise or fluctuations. A nonequilibrium system is one in which there is a net flow of energy. There are, as I see it, four basic levels of sophistication, or paradigms, concerning fluctuations in nature. At the lowest level of sophistication, there is an implicit assumption that noise is negligible: the deterministic paradigm.
The Soft Machine, originally published in 1985, represents a significant contribution to the study of contemporary literature in the larger cultural and scientific context. David Porush shows how the concepts of cybernetics and artificial intelligence that have sparked our present revolution in computer and information technology have also become the source for images and techniques in our most highly sophisticated literature, postmodern fiction by Barthelme, Barth, Pynchon, Beckett, Burroughs, Vonnegut and others. With considerable skill, Porush traces the growth of "the metaphor of the machine" as it evolves both technologically and in literature of the twentieth century. He describes the birth of cybernetics, gives one of the clearest accounts for a lay audience of its major concepts and shows the growth of philosophical resistance to the mechanical model for human intelligence and communication which cybernetics promotes, a model that had grown increasingly influential in the previous decade. The Soft Machine shows postmodern fiction synthesizing the inviting metaphors and concepts of cybernetics with the ideals of art, a synthesis that results in what Porush calls "cybernetic fiction" alive to the myths and images of a cybernetic age.
While the possible depletion of energy sources has been emphasized in most of the literature, this book aims to show that the increase of entropy in the biosphere, accumulating since the dawn of the industrial era, is a cause for urgent concern. As the entropy release puts a limit on sustainable growth, and the atmospheric CO2 content is a reliable indicator of the global entropy release that threatens the biospheric balance, a change of paradigm is necessary: we need to switch from an economy of exploitation to an economy of entropy.
This is the first full-length study of the British artist Roy Ascott, one of the first cybernetic artists, with a career spanning seven decades to date. The book focuses on his early career, exploring the evolution of his early interests in communication in the context of the rich overlaps between art, science and engineering in Britain during the 1950s and 1960s. The first part of the book looks at Ascott's training and early work. The second part looks solely at Groundcourse, Ascott's extraordinary pedagogical model for visual arts and cybernetics, which used an integrative and systems-based model drawing on behaviourism, analogue machines, performance and games. Using hitherto unpublished photographs and documents, this book will establish a more prominent place for cybernetics in post-war British art.
This multidisciplinary volume is the second in the STEAM-H series to feature invited contributions on mathematical applications in naval engineering. Seeking a more holistic approach that transcends current scientific boundaries, leading experts present interdisciplinary instruments and models on a broad range of topics. Each chapter places special emphasis on important methods, research directions, and applications of analysis within the field. Fundamental scientific and mathematical concepts are applied to topics such as microlattice materials in structural dynamics, acoustic transmission in low Mach number liquid flow, differential cavity ventilation on a symmetric airfoil, the Kalman smoother, metallic foam metamaterials for vibration damping and isolation, seal whiskers as a bio-inspired model for the reduction of vortex-induced vibrations, multidimensional integrals for multivariate weighted generalized Gaussian distributions, minimum uniform search track placement for rectangular regions, antennas in the maritime environment, the destabilizing impact of non-performers in multi-agent groups, and inertial navigation accuracy with bias modeling. Carefully peer-reviewed and pedagogically presented for a broad readership, this volume is perfect for graduate and postdoctoral students interested in interdisciplinary research. Researchers in applied mathematics and sciences will find this book an important resource on the latest developments in naval engineering. In keeping with the ideals of the STEAM-H series, this volume will certainly inspire interdisciplinary understanding and collaboration.
In The State of State Theory: State Projects, Repression, and Multi-Sites of Power, Glasberg, Willis, and Shannon argue that state theories should be amended to account both for theoretical developments broadly in the contemporary period as well as the multiple sites of power along which the state governs. Using state projects and policies around political economy, sexuality and family, food, welfare policy, racial formation, and social movements as narrative accounts in how the state operates, the authors argue for a complex and intersectional approach to state theory. In doing so, they expand outside of the canon to engage with perspectives within critical race theory, queer theory, and beyond to build theoretical tools for a contemporary and critical state theory capable of providing the foundations for understanding how the state governs, what is at stake in its governance, and, importantly, how people resist and engage with state power.
This volume aims to pique the interest of many researchers in the fields of infinite-dimensional analysis and quantum probability. These fields have undergone increasingly significant developments and have found many new applications, in particular to classical probability and to different branches of physics. Because these fields are rather wide and of a strongly interdisciplinary nature, we strove to build bridges among them at our Workshop on IDAQP and their Applications, held at the Institute for Mathematical Sciences, National University of Singapore, from 3 to 7 March 2014. Readers will find that this volume contains all the exciting contributions by well-known researchers in search of new directions in these fields.
In almost 60 articles this book reviews the current state of second-order cybernetics and investigates which new research methods second-order cybernetics can offer to tackle wicked problems in science and in society. The contributions explore its application to both scientific fields (such as mathematics, psychology and consciousness research) and non-scientific ones (such as design theory and theater science). The book uses a pluralistic, multifaceted approach to discuss these applications: Each main article is accompanied by several commentaries and author responses, which together allow the reader to discover further perspectives than in the original article alone. This procedure shows that second-order cybernetics is already on its way to becoming an idea shared by many researchers in a variety of disciplines.
This volume is a collection of chapters covering the latest developments in applications of financial mathematics and statistics to topics in energy, commodity financial markets and environmental economics. The research presented is based on the presentations and discussions that took place during the Fields Institute Focus Program on Commodities, Energy and Environmental Finance in August 2013. The authors include applied mathematicians, economists and industry practitioners, providing for a multi-disciplinary spectrum of perspectives on the subject. The volume consists of four sections: Electricity Markets; Real Options; Trading in Commodity Markets; and Oligopolistic Models for Energy Production. Taken together, the chapters give a comprehensive summary of the current state of the art in quantitative analysis of commodities and energy finance. The topics covered include structural models of electricity markets, financialization of commodities, valuation of commodity real options, game-theory analysis of exhaustible resource management and analysis of commodity ETFs. The volume also includes two survey articles that provide a source for new researchers interested in getting into these topics.
This book is devoted to the modeling of multi-level complex systems, a challenging domain for engineers, researchers and entrepreneurs confronted with the transition from learning and adaptability to evolvability and autonomy for technologies, devices and problem-solving methods. Chapter 1 introduces multi-scale and multi-level systems and highlights their presence in different domains of science and technology. Methodologies such as random systems, non-Archimedean analysis and category theory, and specific techniques such as model categorification and integrative closure, are presented in chapter 2. Chapters 3 and 4 describe polystochastic models (PSM) and their developments. The categorical formulation of integrative closure offers the general PSM framework, which serves as a flexible guideline for a large variety of multi-level modeling problems. Focusing on chemical engineering, pharmaceutical and environmental case studies, chapters 5 to 8 analyze mixing, turbulent dispersion and entropy production for multi-scale systems. Taking inspiration from systems sciences, chapters 9 to 11 highlight multi-level modeling potentialities in formal concept analysis, existential graphs and evolvable designs of experiments. Case studies refer to separation flow-sheets, the pharmaceutical pipeline, drug design and development, reliability management systems, security and failure analysis. Perspectives and integrative points of view are discussed in chapter 12. Autonomous and viable systems, multi-agents, organic and autonomic computing, and multi-level informational systems are revealed as promising domains for future applications. Written for: engineers, researchers, entrepreneurs and students in chemical, pharmaceutical, environmental and systems sciences engineering, and for applied mathematicians.
This book is about the definition of the Shannon measure of information and some derived quantities, such as conditional information and mutual information. Unlike many books, which refer to Shannon's Measure of Information (SMI) as 'entropy,' this book makes a clear distinction between the SMI and entropy. In the last chapter, entropy is derived as a special case of SMI. Ample examples are provided which help the reader understand the different concepts discussed in this book. As with previous books by the author, this book aims at a clear and mystery-free presentation of the central concept of information theory, Shannon's Measure of Information. This book presents the fundamental concepts of information theory in friendly, simple language, free of the fancy and pompous statements made by authors of popular science books on this subject. It is unique in its presentation of Shannon's measure of information and the clear distinction between this concept and the thermodynamic entropy. Although some mathematical knowledge is required of the reader, the emphasis is on the concepts and their meaning rather than on the mathematical details of the theory.
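For readers new to these quantities, a minimal sketch of the SMI of a discrete distribution and of mutual information may help; the function name `smi` and the toy distributions below are illustrative choices, not examples from the book:

```python
import math

def smi(p):
    """Shannon's Measure of Information, in bits, of a discrete distribution p."""
    # Terms with zero probability contribute nothing (x * log x -> 0 as x -> 0)
    return sum(-q * math.log2(q) for q in p if q > 0)

# A fair coin carries 1 bit; a certain outcome carries none.
print(smi([0.5, 0.5]))  # 1.0
print(smi([1.0]))       # 0.0

# Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y).
# For an independent pair of fair coins the joint SMI is 2 bits,
# so the mutual information is zero: the variables share no information.
joint = [0.25, 0.25, 0.25, 0.25]  # P(X=x, Y=y) for all four outcomes
mi = smi([0.5, 0.5]) + smi([0.5, 0.5]) - smi(joint)
print(mi)  # 0.0
```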
Why are the instruction manuals for cell phones incomprehensible?
Computing Reality is a rare and challenging research output in the area of cybernetic and system theory, explaining the meaning behind the understanding, interpretation and application of scientific methodology for knowing scientific truth. The fundamental goal of Computing Reality is to explain how knowledge in scientific investigation can be derived, organized and deciphered in the light of unity of knowledge as the episteme. The book uses these foundational socio-scientific ideas in the areas of philosophy of science, economics, society and science, and computer modeling to explain specific socio-scientific problems in the light of the foundational conceptions and their application. Computing Reality invites the reader into a fresh new look at the nature of the relations between reasoning, science, and society. Special reference is given to certain fundamental issues of economics and the world-system in the context of liberalism, globalization and Islam. The technical along with a generalist treatment in the book presents a comprehensively original phenomenological model whose origin lies in a systemic and cybernetic view of unity of knowledge. Some of the new ideas presented here may be substantively provocative to the serious student, academic and researcher in philosophy of science. The book is nonetheless written for the generalist informed reader as well, enabling an interface with today's increasing consciousness of the relationship between religion, morality, ethics, science and society. The book may be considered a pioneering contribution to post-modernist criticism of foundational questions of science and society. Computing Reality is a contribution in the area of system and cybernetic theory examined from the perspective of the science-society interrelationship.
It goes beyond the modern contributions in this area by demonstrating, with conceptual and applied depth, the functional nature of the phenomenological model of unity of knowledge qua religion, science and society. Masudul A. Choudhury, Ph.D., is presently Professor of Economics in the Department of Economics and Finance, College of Commerce and Economics, Sultan Qaboos University, Muscat, Sultanate of Oman, and International Chair of the Postgraduate Program in Islamic Economics and Finance, Trisakti University, Jakarta, Indonesia. M. Shahadat Hossain, Ph.D., is Associate Professor and Chairman of the Department of Computer Science, Chittagong University, Bangladesh.
The main goal is to offer readers a panorama of recent progress in nonlinear physics, complexity and transport, with attractive chapters readable by a broad audience. The book allows readers to gain insight into these active fields of research and notably promotes interdisciplinary studies from mathematics to experimental physics. To reach this aim, the book collects a selection of contributions to the third edition of the CCT conference (Marseilles, 1-5 June 2015).
This book leads readers from a basic foundation to an advanced level of understanding of dynamical and complex systems. It is the perfect text for graduate or PhD mathematical-science students looking for support in topics such as applied dynamical systems, Lotka-Volterra dynamical systems, applied dynamical systems theory, dynamical systems in cosmology, aperiodic order, and complex systems dynamics. Dynamical and Complex Systems is the fifth volume of the LTCC Advanced Mathematics Series. This series is the first to provide advanced introductions to mathematical science topics for advanced students of mathematics. Edited by the three joint heads of the London Taught Course Centre for PhD Students in the Mathematical Sciences (LTCC), each book supports readers in broadening their mathematical knowledge outside of their immediate research disciplines while also covering specialized key areas.
The book is a collection of papers by experts in the fields of information and complexity. Information is a basic structure of the world, while complexity is a fundamental property of systems and processes; there are intrinsic relations between the two. Research in information theory, the theory of complexity and their interrelations is very active. The book will expand knowledge of information, complexity and their relations, representing the most recent and advanced studies and achievements in this area. The goal of the book is to present the topic from different perspectives: mathematical, informational, philosophical, methodological, and others.
This monograph is a technical survey of concepts and techniques for describing and analyzing large-scale time-series data streams. Some topics covered are algorithms for query by humming, gamma-ray burst detection, pairs trading, and density detection. Included are self-contained descriptions of wavelets, fast Fourier transforms, and sketches as they apply to time-series analysis. Detailed applications are built on a solid scientific basis.
This book aims to synthesize different directions in knowledge studies into a unified theory of knowledge and knowledge processes. It explicates important relations between knowledge and information. It provides readers with an understanding of the essence and structure of knowledge, explicating operations and processes that are based on knowledge and are vital for society. The book also highlights how the theory of knowledge paves the way for more advanced design and utilization of computers and networks.