Networks surround us, from social networks to protein-protein interaction networks within the cells of our bodies. The theory of random graphs provides a necessary framework for understanding their structure and development. This text provides an accessible introduction to this rapidly expanding subject. It covers all the basic features of random graphs - component structure, matchings and Hamilton cycles, connectivity and chromatic number - before discussing models of real-world networks, including intersection graphs, preferential attachment graphs and small-world models. Based on the authors' own teaching experience, it can be used as a textbook for a one-semester course on random graphs and networks at advanced undergraduate or graduate level. The text includes numerous exercises, with a particular focus on developing students' skills in asymptotic analysis. More challenging problems are accompanied by hints or suggestions for further reading.
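The component-structure phenomenon mentioned in the blurb is easy to see in a few lines of code. The sketch below (plain Python, our own illustration rather than anything from the book; function names are ours) samples an Erdős-Rényi G(n, p) graph and measures its largest connected component:

```python
import random

def gnp(n, p, seed=0):
    """Sample an Erdos-Renyi G(n, p) graph as an adjacency list: each of the
    n*(n-1)/2 possible edges is present independently with probability p."""
    rng = random.Random(seed)
    adj = {v: [] for v in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].append(v)
                adj[v].append(u)
    return adj

def largest_component(adj):
    """Size of the largest connected component (depth-first search)."""
    seen, best = set(), 0
    for start in adj:
        if start in seen:
            continue
        stack, size = [start], 0
        seen.add(start)
        while stack:
            u = stack.pop()
            size += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        best = max(best, size)
    return best

# Just past average degree 1 a giant component emerges: with n*p = 2 the
# largest component typically spans a constant fraction of the 1000 vertices.
print(largest_component(gnp(1000, 2.0 / 1000, seed=42)))
```

Running this with n·p below 1 instead (say p = 0.5/n) typically yields only small components of logarithmic size, which is the phase transition a course built on this book would analyze asymptotically.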
The subject of the book is the modeling, parameter estimation, and other aspects of the identification of nonlinear dynamic systems. The treatment is restricted to the input-output modeling approach. Because of the widespread use of digital computers, discrete-time methods are preferred. Time-domain parameter estimation methods are dealt with in detail; frequency-domain and power-spectrum procedures are described briefly. The theory is presented from the engineering point of view, and a large number of case studies on the modeling and identification of real processes illustrate the methods. Almost all processes are nonlinear if they are considered beyond a small vicinity of the working point. To exploit industrial equipment as fully as possible, mathematical models are needed that describe the global nonlinear behavior of the process. If the process is unknown, or if the describing equations are too complex, the structure and the parameters can be determined experimentally, which is the task of identification. The book is divided into seven chapters dealing with the following topics: 1. Nonlinear dynamic process models; 2. Test signals for identification; 3. Parameter estimation methods; 4. Nonlinearity test methods; 5. Structure identification; 6. Model validity tests; 7. Case studies on identification of real processes. Chapter 1 summarizes the different model descriptions of nonlinear dynamical systems.
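As a taste of the time-domain parameter estimation covered by books in this area, here is a minimal sketch (our own illustration, not the author's code) that fits a first-order linear-in-parameters input-output model y[k] = a·y[k-1] + b·u[k-1] by least squares from recorded data; nonlinear structures such as Hammerstein models reduce to the same linear-in-parameters form once the regressors include powers of u.

```python
import random

def estimate_arx(u, y):
    """Least-squares fit of y[k] = a*y[k-1] + b*u[k-1] (first-order ARX).
    Solves the 2x2 normal equations (Phi^T Phi) theta = Phi^T y by hand."""
    s11 = s12 = s22 = r1 = r2 = 0.0
    for k in range(1, len(y)):
        p1, p2 = y[k - 1], u[k - 1]
        s11 += p1 * p1
        s12 += p1 * p2
        s22 += p2 * p2
        r1 += p1 * y[k]
        r2 += p2 * y[k]
    det = s11 * s22 - s12 * s12
    return (s22 * r1 - s12 * r2) / det, (s11 * r2 - s12 * r1) / det

# Simulate a known process y[k] = 0.8*y[k-1] + 0.5*u[k-1] under a random
# excitation signal, then recover the parameters from input-output data alone.
rng = random.Random(0)
u = [rng.uniform(-1.0, 1.0) for _ in range(200)]
y = [0.0]
for k in range(1, 200):
    y.append(0.8 * y[k - 1] + 0.5 * u[k - 1])
a, b = estimate_arx(u, y)
print(a, b)  # close to 0.8 and 0.5 for noise-free data
```

The random excitation plays the role of the identification test signal: without a sufficiently rich input, the normal equations become ill-conditioned and the parameters cannot be separated.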
This book presents a comprehensive mathematical theory that explains precisely what information flow is, how it can be assessed quantitatively - so bringing precise meaning to the intuition that certain information leaks are small enough to be tolerated - and how systems can be constructed that achieve rigorous, quantitative information-flow guarantees in those terms. It addresses the fundamental challenge that functional and practical requirements frequently conflict with the goal of preserving confidentiality, making perfect security unattainable. Topics include: a systematic presentation of how unwanted information flow, i.e., "leaks", can be quantified in operationally significant ways and then bounded, both with respect to estimated benefit for an attacking adversary and by comparisons between alternative implementations; a detailed study of capacity, refinement, and Dalenius leakage, supporting robust leakage assessments; a unification of information-theoretic channels and information-leaking sequential programs within the same framework; and a collection of case studies, showing how the theory can be applied to interesting realistic scenarios. The text is unified, self-contained and comprehensive, accessible to students and researchers with some knowledge of discrete probability and undergraduate mathematics, and contains exercises to facilitate its use as a course textbook.
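The idea of quantifying a leak can be made concrete with a small sketch (our own illustration, using the Bayes-vulnerability / min-entropy notion of leakage; the example channel and function names are ours): compare an adversary's best one-guess success probability before and after observing the channel's output.

```python
import math

def prior_vulnerability(prior):
    """Bayes vulnerability V(pi): the adversary's best one-guess success rate."""
    return max(prior)

def posterior_vulnerability(prior, channel):
    """V(pi, C) = sum over observations y of max_x pi[x] * C[x][y]."""
    return sum(max(prior[x] * channel[x][y] for x in range(len(prior)))
               for y in range(len(channel[0])))

def bayes_leakage_bits(prior, channel):
    """Multiplicative Bayes (min-entropy) leakage, in bits."""
    return math.log2(posterior_vulnerability(prior, channel)
                     / prior_vulnerability(prior))

# A channel that reveals only the parity of a 2-bit secret: each observation
# remains consistent with two of the four equally likely secrets.
prior = [0.25, 0.25, 0.25, 0.25]
channel = [[1, 0],   # rows: secrets, columns: P(observation | secret)
           [0, 1],
           [1, 0],
           [0, 1]]
print(bayes_leakage_bits(prior, channel))  # 1.0 bit leaked
```

A channel whose rows are all identical leaks 0 bits under this measure, matching the intuition that an observation carrying no information about the secret is harmless.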
This book on complexity science comprises a collection of chapters on methods and principles from a wide variety of disciplinary fields - from physics and chemistry to biology and the social sciences. In this two-part volume, the first part is a collection of chapters introducing different aspects in a coherent fashion, and providing a common basis and the founding principles of the different complexity science approaches; the next provides deeper discussions of the different methods of use in complexity science, with interesting illustrative applications. The fundamental topics deal with self-organization, pattern formation, forecasting uncertainties, synchronization and revolutionary change, self-adapting and self-correcting systems, and complex networks. Examples are taken from biology, chemistry, engineering, epidemiology, robotics, economics, sociology, and neurology.
Dynamic Programming for Impulse Feedback and Fast Controls offers a description of feedback control in the class of impulsive inputs. This book deals with the problem of closed-loop impulse control based on generalization of dynamic programming techniques in the form of variational inequalities of the Hamilton-Jacobi-Bellman type. It provides exercises and examples in relation to software, such as techniques for regularization of ill-posed problems. It also gives an introduction to applications such as hybrid dynamics, control in arbitrary small time, and discontinuous trajectories. This book walks the readers through: the design and description of feedback solutions for impulse controls; the explanation of impulses of higher order that are derivatives of delta functions; the description of their physically realizable approximations - the fast controls; the treatment of uncertainty in impulse control and the applications of impulse feedback. Of interest to both academics and graduate students in the field of control theory and applications, the book also protects users from common errors, such as inappropriate solution attempts, by indicating Hamiltonian techniques for hybrid systems with resets.
This book reports on the development and assessment of a novel framework for studying neural interactions (the connectome) and their dynamics (the chronnectome). Using EEG recordings taken during an auditory oddball task performed by 48 patients with schizophrenia and 87 healthy controls, and applying local and network measures, changes in brain activation from pre-stimulus to cognitive response were assessed, and significant differences were observed between the patients and controls. This book investigates the source of the network abnormalities and presents new evidence for the disconnection hypothesis and the aberrant salience hypothesis with regard to schizophrenia. Moreover, it puts forward a novel approach to combining local regularity measures and graph measures in order to characterize schizophrenia brain dynamics, and presents interesting findings on the regularity of brain patterns in healthy control subjects versus patients with schizophrenia. Besides providing new evidence for the disconnection hypothesis, it offers a source of inspiration for future research directions in the field.
Learn to solve the unprecedented challenges facing Online Learning and Adaptive Signal Processing in this concise, intuitive text. The ever-increasing amount of data generated every day requires new strategies to tackle issues such as: combining data from a large number of sensors; improving spectral usage by utilizing multiple antennas with adaptive capabilities; or learning from signals placed on graphs, generating unstructured data. Solutions to all of these and more are described in a condensed and unified way, enabling you to expose valuable information from data and signals in a fast and economical way. The up-to-date techniques explained here can be implemented in simple electronic hardware, or as part of multi-purpose systems. Also featuring alternative explanations for online learning, including newly developed methods and data selection, and several easily implemented algorithms, this one-of-a-kind book is an ideal resource for graduate students, researchers, and professionals in online learning and adaptive filtering.
Winner of the Neumann Prize for the History of Mathematics. In their second collaboration, biographers Jimmy Soni and Rob Goodman present the story of Claude Shannon—one of the foremost intellects of the twentieth century and the architect of the Information Age, whose insights stand behind every computer built, email sent, video streamed, and webpage loaded. Claude Shannon was a groundbreaking polymath, a brilliant tinkerer, and a digital pioneer. He constructed the first wearable computer, outfoxed Vegas casinos, and built juggling robots. He also wrote the seminal text of the digital revolution, which has been called “the Magna Carta of the Information Age.” In this elegantly written, exhaustively researched biography, Soni and Goodman reveal Claude Shannon’s full story for the first time. With unique access to Shannon’s family and friends, A Mind at Play brings this singular innovator and always playful genius to life.
Discover the foundations of quantum mechanics, and explore how these principles are powering a new generation of advances in quantum engineering, in this ground-breaking undergraduate textbook. It explains physical and mathematical principles using cutting-edge electronic, optoelectronic and photonic devices, linking underlying theory with real-world applications; focuses on current technologies and avoids historic approaches, getting students quickly up to speed to tackle contemporary engineering challenges; provides an introduction to the foundations of quantum information, and a wealth of real-world quantum examples, including quantum well infrared photodetectors, solar cells, quantum teleportation, quantum computing, band gap engineering, quantum cascade lasers, low-dimensional materials, and van der Waals heterostructures; and includes pedagogical features such as objectives and end-of-chapter homework problems to consolidate student understanding, and solutions for instructors. Designed to inspire the development of future quantum devices and systems, this is the perfect introduction to quantum mechanics for undergraduate electrical engineers and materials scientists.
This compact course is written for the mathematically literate reader who wants to learn to analyze data in a principled fashion. The language of mathematics enables clear exposition that can go quite deep, quite quickly, and naturally supports an axiomatic and inductive approach to data analysis. Starting with a good grounding in probability, the reader moves to statistical inference via topics of great practical importance - simulation and sampling, as well as experimental design and data collection - that are typically displaced from introductory accounts. The core of the book then covers both standard methods and such advanced topics as multiple testing, meta-analysis, and causal inference.
At the dawn of the twenty-first century, education about and through the media has become a worldwide phenomenon, and is playing an increasingly important role in educational reform. The theory and practice of media education have profited greatly from recent and intensive development and application of new information and telecommunications technologies. Consequently, the importance of media and information literacy is taking on an even greater urgency. With this in mind, the contributors to this volume survey what has taken place over the last decade in different parts of the world, examine the current state of theoretical, conceptual, and research development, and consider where media education is going and where it ought to go. With two-thirds of its 22 contributions coming from outside the United States, "Media Literacy in the Information Age" is a genuine international effort, with many leading media and information educators in the world taking part. The work converts the notion of globalism from a slogan into a working hypothesis. The concerns in this volume are with literacy not just in computer technology, but as a broad concern of the educational process.
Compound renewal processes (CRPs) are among the most ubiquitous models arising in applications of probability. At the same time, they are a natural generalization of random walks, the most well-studied classical objects in probability theory. This monograph, written for researchers and graduate students, presents the general asymptotic theory and generalizes many well-known results concerning random walks. The book contains the key limit theorems for CRPs, functional limit theorems, integro-local limit theorems, large and moderately large deviation principles for CRPs in the state space and in the space of trajectories, including large deviation principles in boundary crossing problems for CRPs, with an explicit form of the rate functionals, and an extension of the invariance principle for CRPs to the domain of moderately large and small deviations. Applications establish the key limit laws for Markov additive processes, including limit theorems in the domains of normal and large deviations.
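A compound renewal process is simple to simulate, which helps make the limit laws above tangible. The sketch below (our own illustration; exponential gaps make this the compound Poisson special case of a CRP) samples S(t) and checks the law-of-large-numbers limit S(t)/t → E[jump]/E[gap]:

```python
import random

def simulate_crp(t, rng, mean_gap=1.0):
    """One sample of a compound renewal process S(t): renewal gaps are
    Exp(1/mean_gap) (the compound Poisson special case of a CRP) and the
    jump at each renewal is Uniform(0, 1)."""
    clock, total = 0.0, 0.0
    while True:
        clock += rng.expovariate(1.0 / mean_gap)
        if clock > t:
            return total
        total += rng.random()

rng = random.Random(0)
samples = [simulate_crp(100.0, rng) for _ in range(2000)]
mean = sum(samples) / len(samples)
# Law of large numbers for CRPs: S(t)/t -> E[jump]/E[gap] = 0.5,
# so the sample mean of S(100) should sit near 50.
print(mean)
```

The book's subject begins where such simulations end: the fluctuations of S(t) around this deterministic drift, from the central-limit regime out to large deviations.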
"SSM offers an elegantly simple approach, one that is powerful yet non-threatening and that forces organisations to confront questions essential to their very survival, such as 'Are we doing the right thing?'" Since its inception more than thirty years ago, the benefits of using Soft Systems Methodology for problem solving have gained worldwide recognition. Yet, despite recognising the importance of SSM, students and practitioners still experience considerable difficulty with the intellectual process involved. Based on a lifetime of experience as an academic and consultant, Brian Wilson provides guidance on how to develop a range of conceptual models across a variety of business problems. Building on his earlier work in Systems: Concepts, Methodologies and Applications, he takes a practical approach to the topic based on the premise that all organisations are unique. He develops concepts to articulate ways of thinking about complexity. These are an alternative to mathematically-based concepts, and they offer rigorous and defensible ways of answering the question 'What do we take the organisation to be?' A model of the most appropriate and relevant concept for your own organisation can then be successfully developed and applied. Of relevance to organisations of any type or any size, this book shows how model building within SSM can be used to cope with real-life problems. It will be an invaluable resource for students and practitioners in both the public and private sectors.
This is the first full-length book on the major theme of symmetry in graphs. Forming part of algebraic graph theory, this fast-growing field is concerned with the study of highly symmetric graphs, particularly vertex-transitive graphs, and other combinatorial structures, primarily by group-theoretic techniques. In practice the street goes both ways and these investigations shed new light on permutation groups and related algebraic structures. The book assumes a first course in graph theory and group theory but no specialized knowledge of the theory of permutation groups or vertex-transitive graphs. It begins with the basic material before introducing the field's major problems and most active research themes in order to motivate the detailed discussion of individual topics that follows. Featuring many examples and over 450 exercises, it is an essential introduction to the field for graduate students and a valuable addition to any algebraic graph theorist's bookshelf.
Searching for Trust explores the intersection of trust, disinformation, and blockchain technology in an age of heightened institutional and epistemic mistrust. It adopts a unique archival theoretic lens to delve into how computational information processing has gradually supplanted traditional record keeping, putting at risk a centuries-old tradition of the 'moral defense of the record' and replacing it with a dominant ethos of information-processing efficiency. The author argues that focusing on information-processing efficiency over the defense of records against manipulation and corruption (the ancient task of the recordkeeper) has contributed to a diminution of the trustworthiness of information and a rise of disinformation, with attendant destabilization of the epistemic trust fabric of societies. Readers are asked to consider the potential and limitations of blockchains as the technological embodiment of the moral defense of the record and as means to restoring societal trust in an age of disinformation.
The mathematical theory of "open" dynamical systems is a creation of the twentieth century. Its humble beginnings focused on ideas of Laplace transforms applied to linear problems of automatic control and to the analysis and synthesis of electrical circuits. However, during the second half of the century, it flowered into a field based on an array of sophisticated mathematical concepts and techniques from algebra, nonlinear analysis and differential geometry. The central notion is that of a dynamical system that exchanges matter, energy, or information with its surroundings, i.e. an "open" dynamical system. The mathematization of this notion evolved considerably over the years. The early development centered around the input/output point of view and led to important results, particularly in controller design. Thinking about open systems as a "black box" that accepts stimuli and produces responses has had a wide influence also in areas outside engineering, for example in biology, psychology, and economics. In the early 1960s, especially through the work of Kalman, input/state/output models came into vogue. This model class accommodates very nicely the internal initial conditions that are essentially always present in a dynamical system. The introduction of input/state/output models led to a tempestuous development that made systems and control into a mature discipline with a wide range of concepts, results, algorithms, and applications.
The definitive guide to control system design. Modern Control System Theory and Design, Second Edition offers the most comprehensive treatment of control systems available today. Its unique text/software combination integrates classical and modern control system theories, while promoting an interactive, computer-based approach to design solutions. The sheer volume of practical examples, as well as the hundreds of illustrations of control systems from all engineering fields, make this volume accessible to students and indispensable for professional engineers. This fully updated Second Edition features a new chapter on modern control system design, including state-space design techniques, Ackermann's formula for pole placement, estimation, robust control, and the H∞ method for control system design.
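Ackermann's formula for pole placement, mentioned above, can be worked through by hand for a small system. The sketch below (plain Python, our own illustration rather than the book's material) places the poles of a double integrator at -1 and -2 using K = [0 1]·inv(Ctrb)·phi(A), where Ctrb = [B A·B] is the controllability matrix and phi is the desired closed-loop characteristic polynomial evaluated at A:

```python
# Ackermann's formula K = [0 1] * inv(Ctrb) * phi(A) for a two-state system.
# Plant: double integrator x' = A x + B u.

def mat2_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat2_inv(X):
    det = X[0][0] * X[1][1] - X[0][1] * X[1][0]
    return [[ X[1][1] / det, -X[0][1] / det],
            [-X[1][0] / det,  X[0][0] / det]]

A = [[0.0, 1.0], [0.0, 0.0]]
B = [0.0, 1.0]

# Controllability matrix [B  A*B], stored column-wise
AB = [sum(A[i][k] * B[k] for k in range(2)) for i in range(2)]
ctrb = [[B[0], AB[0]], [B[1], AB[1]]]

# Desired poles -1 and -2: phi(s) = s^2 + 3s + 2, so phi(A) = A*A + 3A + 2I
A2 = mat2_mul(A, A)
phiA = [[A2[i][j] + 3.0 * A[i][j] + (2.0 if i == j else 0.0)
         for j in range(2)] for i in range(2)]

# [0 1] * inv(Ctrb) selects the last row of inv(Ctrb)
row = mat2_inv(ctrb)[1]
K = [row[0] * phiA[0][j] + row[1] * phiA[1][j] for j in range(2)]
print(K)  # [2.0, 3.0]: A - B*K then has eigenvalues -1 and -2
```

Substituting back confirms the result: A - B·K = [[0, 1], [-2, -3]] has characteristic polynomial s² + 3s + 2 = (s + 1)(s + 2).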
Superbly organized and easy-to-use, Modern Control System Theory and Design, Second Edition is an ideal textbook for introductory courses in control systems and an excellent professional reference. Its interdisciplinary approach makes it invaluable for practicing engineers in electrical, mechanical, aeronautical, chemical, and nuclear engineering and related areas.
The extraordinary growth of the Internet and other information age developments in communications, particularly marketing communications, are transforming the international and national business environments. This work is designed to help people, whether with large corporations or the smallest of enterprises, to venture with confidence into cyberspace. Written in non-technical language by a businessman for other business people, the book seizes on the many opportunities emerging for enterprises and personal career development. The socio-economic impact is covered from important perspectives, such as the impact on the Third World. The book heralds an age in which information becomes an internationally marketable commodity, and gives individual entrepreneurs on minuscule budgets the tools to compete against the big corporate guns on a level playing field. The appendices include useful electronic mailing lists, a product protection and order delivery system on the Web, and details on getting connected in the UK.
Strongly regular graphs lie at the intersection of statistical design, group theory, finite geometry, information and coding theory, and extremal combinatorics. This monograph collects all the major known results together for the first time in book form, creating an invaluable text that researchers in algebraic combinatorics and related areas will refer to for years to come. The book covers the theory of strongly regular graphs, polar graphs, rank 3 graphs associated to buildings and Fischer groups, cyclotomic graphs, two-weight codes and graphs related to combinatorial configurations such as Latin squares, quasi-symmetric designs and spherical designs. It gives the complete classification of rank 3 graphs, including some new constructions. More than 100 graphs are treated individually. Some unified and streamlined proofs are featured, along with original material including a new approach to the (affine) half spin graphs of rank 5 hyperbolic polar spaces.
For 80 years, mathematics has driven fundamental innovation in computing and communications. This timely book provides a panorama of some recent ideas in mathematics and how they will drive continued innovation in computing, communications and AI in the coming years. It provides a unique insight into how the new techniques that are being developed can be used to provide theoretical foundations for technological progress, just as mathematics was used in earlier times by Turing, von Neumann, Shannon and others. Edited by leading researchers in the field, chapters cover the application of new mathematics in computer architecture, software verification, quantum computing, compressed sensing, networking, Bayesian inference, machine learning, reinforcement learning and many other areas.
There is a need for general theoretical principles describing and explaining effective design -- those which demonstrate "unity" and enhance comprehension and usability. Theories of cohesion from linguistics and of comprehension in psychology are likely sources of such general principles. Unfortunately, linguistic approaches to discourse unity have focused exclusively on semantic elements such as synonymy or anaphora, and have ignored other linguistic elements such as syntactic parallelism and phonological alliteration. They have also overlooked the non-linguistic elements -- visual factors such as typography or color, and auditory components such as pitch or duration. In addition, linguistic approaches have met with criticism because they have failed to explain the relationship between semantic cohesive elements and coherence. On the other hand, psychological approaches to discourse comprehension have considered the impact of a wider range of discourse elements -- typographical cuing of key terms to enhance comprehension, for example -- but have failed to provide general theoretical explanations for such observations.
This lucid, accessible introduction to supervised machine learning presents core concepts in a focused and logical way that is easy for beginners to follow. The author assumes basic calculus, linear algebra, probability and statistics but no prior exposure to machine learning. Coverage includes widely used traditional methods such as SVMs, boosted trees, HMMs, and LDAs, plus popular deep learning methods such as convolutional neural nets, attention, transformers, and GANs. Organized in a coherent presentation framework that emphasizes the big picture, the text introduces each method clearly and concisely "from scratch" based on the fundamentals. All methods and algorithms are described in a clean and consistent style, with a minimum of unnecessary detail. Numerous case studies and concrete examples demonstrate how the methods can be applied in a variety of contexts.
This book provides a comprehensive explanation of forward error correction, which is a vital part of communication systems. The book is written in such a way as to make the subject easy and understandable for the reader. It starts with a review of linear algebra to provide a basis for the text. The author then goes on to cover linear block codes, syndrome error correction, cyclic codes, Galois fields, BCH codes, Reed-Solomon codes, and convolutional codes. Examples are provided throughout the text.
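Syndrome error correction, one of the topics listed above, fits in a short sketch. The example below (our own illustration, not taken from the book) uses the classic Hamming (7,4) linear block code: the syndrome of a received word is zero for a valid codeword, and otherwise equals the parity-check column of the flipped bit, pinpointing any single-bit error.

```python
# Hamming (7,4): G = [I | P] generates codewords, H = [P^T | I] checks them;
# syndrome decoding corrects any single-bit error. Arithmetic is over GF(2).
G = [[1,0,0,0,1,1,0],
     [0,1,0,0,1,0,1],
     [0,0,1,0,0,1,1],
     [0,0,0,1,1,1,1]]
H = [[1,1,0,1,1,0,0],
     [1,0,1,1,0,1,0],
     [0,1,1,1,0,0,1]]

def encode(msg):
    """4 message bits -> 7-bit systematic codeword c = m*G."""
    return [sum(msg[i] * G[i][j] for i in range(4)) % 2 for j in range(7)]

def syndrome(word):
    """s = H*word; zero iff word is a codeword."""
    return [sum(H[r][j] * word[j] for j in range(7)) % 2 for r in range(3)]

def decode(word):
    """Syndrome decoding: a nonzero syndrome equals the H-column of the
    flipped position, so match it and flip that bit back."""
    s = syndrome(word)
    if any(s):
        for j in range(7):
            if [H[r][j] for r in range(3)] == s:
                word = word[:]
                word[j] ^= 1
                break
    return word[:4]  # systematic: first 4 bits carry the message

msg = [1, 0, 1, 1]
received = encode(msg)
received[2] ^= 1                 # inject a single-bit channel error
print(decode(received))          # recovers [1, 0, 1, 1]
```

Cyclic, BCH, and Reed-Solomon codes covered later in such a text generalize this same syndrome idea to correct multiple errors using Galois-field arithmetic.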