Today, our cities are an embodiment of the complex, historical evolution of knowledge, desires and technology. Our planned and designed activities co-evolve with our aspirations, mediated by the existing technologies and social structures. The city represents the accretion and accumulation of successive layers of collective activity, structuring and being structured by other, increasingly distant cities, reaching now right around the globe. This historical and structural development cannot therefore be understood or captured by any set of fixed quantitative relations. Structural changes imply that the patterns of growth, and their underlying reasons, change over time, and therefore that any attempt to control the morphology of cities and their patterns of flow by means of planning and design must be dynamical, based on the mechanisms that drive the changes occurring at a given moment. This carefully edited post-proceedings volume gathers a snapshot view, by leading researchers in the field, of current complexity theories of cities. In it, the achievements, criticisms and potentials yet to be realized are reviewed, and the implications for planning and urban design are assessed.
This volume is the first book describing the new concept of "Mixed Reality" which is a kind of virtual reality in a broader sense. Published as the proceedings of the first International Symposium on Mixed Reality and written by an interdisciplinary group of experts from all over the world in both industry and academia, this book provides an in-depth look at the current state of mixed reality technology and the scope of its use in entertainment and interactive arts, as well as in engineering and medical applications. Because of the inherent interdisciplinary applications of the mixed reality technology, this book will be useful for computer scientists in computer graphics, computer vision, human computer interaction, and multimedia technologies, and for people involved in cinema/movie, architecture/civil engineering, medical informatics, and interactive entertainment.
This book provides the foundations for a rigorous theory of functional analysis with bicomplex scalars. It begins with a detailed study of bicomplex and hyperbolic numbers and then defines the notion of bicomplex modules. After introducing a number of norms and inner products on such modules (some of which appear in this volume for the first time), the authors develop the theory of linear functionals and linear operators on bicomplex modules. All of this may serve for many different developments, just like the usual functional analysis with complex scalars and in this book it serves as the foundational material for the construction and study of a bicomplex version of the well known Schur analysis.
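The bicomplex numbers underlying this theory are easy to experiment with. Below is a minimal sketch (our own illustration, not code from the book) of bicomplex multiplication; it exhibits the zero divisors that make norms and inner products on bicomplex modules delicate:

```python
# A bicomplex number is z1 + z2*j with z1, z2 complex, where j*j = -1
# and j commutes with the complex unit i. (Class name is ours.)

class Bicomplex:
    def __init__(self, z1, z2):
        self.z1 = complex(z1)   # "scalar" complex part
        self.z2 = complex(z2)   # coefficient of j

    def __mul__(self, other):
        # (z1 + z2 j)(w1 + w2 j) = (z1 w1 - z2 w2) + (z1 w2 + z2 w1) j
        return Bicomplex(self.z1 * other.z1 - self.z2 * other.z2,
                         self.z1 * other.z2 + self.z2 * other.z1)

    def __repr__(self):
        return f"({self.z1}) + ({self.z2})j"

# Unlike the complex numbers, the bicomplex numbers have zero divisors:
# (1 + i j)(1 - i j) = 1 - (i j)^2 = 1 - (i^2)(j^2) = 1 - 1 = 0.
a = Bicomplex(1, 1j)    # 1 + i j
b = Bicomplex(1, -1j)   # 1 - i j
p = a * b
assert p.z1 == 0 and p.z2 == 0
```

The presence of such zero divisors is precisely why the usual complex-scalar arguments do not transfer verbatim and new norms and inner products are needed.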
Luciano Floridi develops an original ethical framework for dealing with the new challenges posed by Information and Communication Technologies (ICTs). ICTs have profoundly changed many aspects of life, including the nature of entertainment, work, communication, education, health care, industrial production and business, social relations, and conflicts. They have had a radical and widespread impact on our moral lives and on contemporary ethical debates. Privacy, ownership, freedom of speech, responsibility, technological determinism, the digital divide, and pornography online are only some of the pressing issues that characterise the ethical discourse in the information society. They are the subject of Information Ethics (IE), the new philosophical area of research that investigates the ethical impact of ICTs on human life and society. Since the seventies, IE has been a standard topic in many curricula. In recent years, there has been a flourishing of new university courses, international conferences, workshops, professional organizations, specialized periodicals and research centres. However, investigations have so far been largely influenced by professional and technical approaches, addressing mainly legal, social, cultural and technological problems. This book is the first philosophical monograph entirely and exclusively dedicated to it. Floridi lays down, for the first time, the conceptual foundations for IE. He does so systematically, by pursuing three goals: a) a metatheoretical goal: it describes what IE is, its problems, approaches and methods; b) an introductory goal: it helps the reader to gain a better grasp of the complex and multifarious nature of the various concepts and phenomena related to computer ethics; c) an analytic goal: it answers several key theoretical questions of great philosophical interest, arising from the investigation of the ethical implications of ICTs. 
Although entirely independent of Floridi's previous book, The Philosophy of Information (OUP, 2011), The Ethics of Information complements it as new work on the foundations of the philosophy of information.
This book provides readers with a concise introduction to current studies on operator-algebras and their generalizations, operator spaces and operator systems, with a special focus on their application in quantum information science. This basic framework for the mathematical formulation of quantum information can be traced back to the mathematical work of John von Neumann, one of the pioneers of operator algebras, which forms the underpinning of most current mathematical treatments of the quantum theory, besides being one of the most dynamic areas of twentieth century functional analysis. Today, von Neumann's foresight finds expression in the rapidly growing field of quantum information theory. These notes gather the content of lectures given by a very distinguished group of mathematicians and quantum information theorists, held at the IMSc in Chennai some years ago, and great care has been taken to present the material as a primer on the subject matter. Starting from the basic definitions of operator spaces and operator systems, this text proceeds to discuss several important theorems including Stinespring's dilation theorem for completely positive maps and Kirchberg's theorem on tensor products of C*-algebras. It also takes a closer look at the abstract characterization of operator systems and, motivated by the requirements of different tensor products in quantum information theory, the theory of tensor products in operator systems is discussed in detail. 
On the quantum information side, the book offers a rigorous treatment of quantifying entanglement in bipartite quantum systems, and moves on to review four different areas in which ideas from the theory of operator systems and operator algebras play a natural role: the issue of zero-error communication over quantum channels, the strong subadditivity property of quantum entropy, the different norms on quantum states and the corresponding induced norms on quantum channels, and, lastly, the applications of matrix-valued random variables in the quantum information setting.
Building upon a long tradition of scientific conferences dealing with problems of reliability in technical systems, in 2006 the Department of Computer Engineering at Wroclaw University of Technology established the DepCoS-RELCOMEX series of events in order to promote a comprehensive approach to the evaluation of system performability, which is now commonly called dependability. Contemporary complex systems integrate a variety of technical, information, software and human (users, administrators and management) resources. Their complexity comes not only from the technical and organizational structures involved but mainly from the complexity of the information processes that must be implemented in a specific operational environment (data processing, monitoring, management, etc.). In such cases, traditional methods of reliability evaluation, focused mainly on technical levels, are insufficient, and more innovative, multidisciplinary methods of dependability analysis must be applied. The selection of submissions for these proceedings exemplifies the diversity of topics that must be included in such analyses: tools, methodologies and standards for modelling, design and simulation of the systems; security and confidentiality in information processing; specific issues of heterogeneous, today often wireless, computer networks; and management of transportation networks. In addition, this edition of the conference hosted the 5th CrISS-DESSERT Workshop, devoted to the problems of security and safety in critical information systems.
The papers in this volume present an overview of the general aspects and practical applications of dynamic inverse methods, through the interaction of several topics, ranging from classical and advanced inverse problems in vibration, isospectral systems, dynamic methods for structural identification, active vibration control and damage detection, imaging shear stiffness in biological tissues, wave propagation, to computational and experimental aspects relevant for engineering problems.
"Energy Methods in Dynamics "is a textbook based on the lectures given by the first author at Ruhr University Bochum, Germany. Its aim is to help students acquire both a good grasp of the first principles from which the governing equations can be derived, and the adequate mathematical methods for their solving. Its distinctive features, as seen from the title, lie in the systematic and intensive use of Hamilton's variational principle and its generalizations for deriving the governing equations of conservative and dissipative mechanical systems, and also in providing the direct variational-asymptotic analysis, whenever available, of the energy and dissipation for the solution of these equations. It demonstrates that many well-known methods in dynamics like those of Lindstedt-Poincare, Bogoliubov-Mitropolsky, Kolmogorov-Arnold-Moser (KAM), Wentzel Kramers Brillouin (WKB), and Whitham are derivable from this variational-asymptotic analysis. This second edition includes the solutions to all exercises as well as some new materials concerning amplitude and slope modulations of nonlinear dispersive waves."
Control of Linear Parameter Varying Systems compiles state-of-the-art contributions on novel analytical and computational methods for addressing system identification, model reduction, performance analysis and feedback control design, and addresses theoretical developments, novel computational approaches and illustrative applications to various fields. Part I discusses modeling and system identification of linear parameter varying systems (LPVS); Part II covers the importance of analysis and control design when working with LPVS; finally, Part III presents an applications-based approach to linear parameter varying systems, including modeling of turbocharged diesel engines, multivariable control of wind turbines, modeling and control of aircraft engines, control of autonomous underwater vehicles, and analysis and synthesis of re-entry vehicles.
A knowledge of linear systems provides a firm foundation for the study of optimal control theory and many areas of system theory and signal processing. State-space techniques developed since the early sixties have proved to be very effective. The main objective of this book is to present a brief and reasonably complete investigation of the theory of linear systems, with emphasis on these techniques, in both continuous-time and discrete-time settings, and to demonstrate an application to the study of elementary (linear and nonlinear) optimal control theory. An essential feature of the state-space approach is that both time-varying and time-invariant systems are treated systematically. When time-varying systems are considered, another important subject that depends very much on the state-space formulation is real-time filtering, prediction, and smoothing via the Kalman filter. This subject is treated in our monograph entitled "Kalman Filtering with Real-Time Applications", published in this Springer Series in Information Sciences (Volume 17). For time-invariant systems, the recent frequency-domain approaches using the techniques of Adamjan, Arov, and Krein (also known as AAK), balanced realization, and H-infinity theory via Nevanlinna-Pick interpolation seem very promising, and these will be studied in our forthcoming monograph entitled "Mathematical Approach to Signal Processing and System Theory." The present elementary treatise on linear system theory should provide enough engineering and mathematical background for the study of these two subjects.
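As a toy illustration of the state-space formulation emphasized above (a generic example of ours, not the authors' code), the following simulates a discrete-time linear system x[k+1] = A x[k] + B u[k], y[k] = C x[k], using plain lists so the sketch is self-contained:

```python
# Discrete-time state-space simulation with hand-rolled 2x2 linear algebra.
# All matrices and values below are invented for illustration.

def mat_vec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def vec_add(x, y):
    return [a + b for a, b in zip(x, y)]

def simulate(A, B, C, x0, inputs):
    """Return the output sequence y[0..N-1] for the input sequence u[0..N-1]."""
    x, ys = list(x0), []
    for u in inputs:
        ys.append(sum(c * xi for c, xi in zip(C, x)))   # y[k] = C x[k]
        x = vec_add(mat_vec(A, x), [b * u for b in B])  # x[k+1] = A x[k] + B u[k]
    return ys

# A stable second-order system driven by a unit step:
A = [[0.5, 0.1],
     [0.0, 0.8]]
B = [1.0, 0.5]
C = [1.0, 0.0]
ys = simulate(A, B, C, x0=[0.0, 0.0], inputs=[1.0] * 50)
# The eigenvalues 0.5 and 0.8 lie inside the unit circle, so the step
# response settles to the finite steady state (I - A)^{-1} B = [2.5, 2.5].
```

The same recursion, with time-varying A, B, C, covers the time-varying case the book stresses.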
This textbook contains the essential knowledge in modeling, simulation, analysis, and applications in dealing with biological cellular control systems. In particular, the book shows how to use the law of mass balance and the law of mass action to derive an enzyme kinetic model - the Michaelis-Menten function or the Hill function, how to use a current-voltage relation, Nernst potential equilibrium equation, and Hodgkin and Huxley's models to model an ionic channel or pump, and how to use the law of mass balance to integrate these enzyme or channel models into a complete feedback control system. The book also illustrates how to use data to estimate parameters in a model, how to use MATLAB to solve a model numerically, how to do computer simulations, and how to provide model predictions. Furthermore, the book demonstrates how to conduct a stability and sensitivity analysis on a model.
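The enzyme-kinetics and membrane formulas named above are standard and easy to state in code. The sketch below (our illustration; the parameter values are invented, not taken from the book) shows the Michaelis-Menten and Hill rate laws and the Nernst equilibrium potential:

```python
import math

def michaelis_menten(S, Vmax, Km):
    # Rate v = Vmax * S / (Km + S), derived from the law of mass action
    # under the quasi-steady-state assumption.
    return Vmax * S / (Km + S)

def hill(S, Vmax, Km, n):
    # Hill function: cooperative binding with Hill coefficient n
    # (n = 1 recovers Michaelis-Menten).
    return Vmax * S**n / (Km**n + S**n)

def nernst(z, c_out, c_in, T=310.0):
    # Nernst equilibrium potential E = (R*T / (z*F)) * ln(c_out / c_in), in volts.
    R, F = 8.314, 96485.0
    return (R * T) / (z * F) * math.log(c_out / c_in)

# At S = Km the Michaelis-Menten rate is exactly half of Vmax:
assert abs(michaelis_menten(2.0, Vmax=1.0, Km=2.0) - 0.5) < 1e-12
# Potassium at typical mammalian concentrations gives E_K near -90 mV:
E_K = nernst(z=1, c_out=5.0, c_in=140.0)
```

Building a feedback model of the kind the book describes amounts to coupling such rate laws through mass-balance equations and integrating them numerically.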
This book contains all refereed papers that were accepted to the second edition of the "Complex Systems Design & Management" (CSDM 2011) international conference that took place in Paris (France) from December 7 to December 9, 2011 (website: http://www.csdm2011.csdm.fr/). These proceedings cover the most recent trends in the emerging field of complex systems sciences & practices from an industrial and academic perspective, including the main industrial domains (transport, defense & security, electronics, energy & environment, e-services), scientific & technical topics (systems fundamentals, systems architecture & engineering, systems metrics & quality, systemic tools) and system types (transportation systems, embedded systems, software & information systems, systems of systems, artificial ecosystems). The CSDM 2011 conference is organized under the guidance of the CESAMES non-profit organization (http://www.cesames.net/).
This book is dedicated to the memory of Israel Gohberg (1928-2009) - one of the great mathematicians of our time - who inspired innumerable fellow mathematicians and directed many students. The volume reflects the wide spectrum of Gohberg's mathematical interests. It consists of more than 25 invited and peer-reviewed original research papers written by his former students, co-authors and friends. Included are contributions to single and multivariable operator theory, commutative and non-commutative Banach algebra theory, the theory of matrix polynomials and analytic vector-valued functions, several variable complex function theory, and the theory of structured matrices and operators. Also treated are canonical differential systems, interpolation, completion and extension problems, numerical linear algebra and mathematical systems theory.
Chaos and nonlinear dynamics initially developed as a new emergent field with its foundation in physics and applied mathematics. The highly generic, interdisciplinary quality of the insights gained in the last few decades has spawned myriad applications in almost all branches of science and technology, and even well beyond. Wherever quantitative modeling and analysis of complex, nonlinear phenomena is required, chaos theory and its methods can play a key role. This third volume concentrates on reviewing further relevant contemporary applications of chaotic nonlinear systems as they apply to the various cutting-edge branches of engineering. This encompasses, but is not limited to, topics such as fluctuation relations and chaotic dynamics in physics, fractals and their applications in epileptic seizures, as well as chaos synchronization. Featuring contributions from active and leading research groups, this collection is ideal both as a reference and as a 'recipe book' full of tried and tested, successful engineering applications.
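The defining feature of the chaotic systems surveyed here, sensitive dependence on initial conditions, fits in a few lines. The following sketch (a standard textbook example, not drawn from the book) uses the logistic map x[n+1] = r x[n](1 - x[n]) at r = 4:

```python
# Two orbits of the logistic map started a tiny distance apart separate
# until they are macroscopically different: sensitive dependence on
# initial conditions.

def logistic_orbit(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-10)   # perturbed by one part in ten billion
gap = max(abs(x - y) for x, y in zip(a, b))
# After a few dozen iterations the initial 1e-10 perturbation has grown
# to order one, which is why long-term point prediction fails even though
# the map itself is deterministic.
```

This is the mechanism that makes statistical and synchronization-based methods, rather than long-horizon forecasting, the practical tools for chaotic engineering systems.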
This volume describes mesoscopic systems with classically chaotic dynamics using semiclassical methods which combine elements of classical dynamics and quantum interference effects. Experiments and numerical studies show that Random Matrix Theory (RMT) explains physical properties of these systems well. This was conjectured more than 25 years ago by Bohigas, Giannoni and Schmit for the spectral properties. Since then, it has been a challenge to understand this connection analytically. The author offers his readers a clearly-written and up-to-date treatment of the topics covered. He extends previous semiclassical approaches that treated spectral and conductance properties. He shows that RMT results can in general only be obtained semiclassically when taking into account classical configurations not considered previously, for example those containing multiply traversed periodic orbits. Furthermore, semiclassics is capable of describing effects beyond RMT. In this context he studies the effect of a non-zero Ehrenfest time, which is the minimal time needed for an initially spatially localized wave packet to show interference. He derives its signature on several quantities characterizing mesoscopic systems, e.g. dc and ac conductance, dc conductance variance, n-pair correlation functions of scattering matrices and the gap in the density of states of Andreev billiards.
" Models of Science Dynamics aims to capture the structure and evolution of science, the emerging arena in which scholars, science and the communication of science become themselves the basic objects of research. In order to capture the essence of phenomena as diverse as the structure of co-authorship networks or the evolution of citation diffusion patterns, such models can be represented by conceptual models based on historical and ethnographic observations, mathematical descriptions of measurable phenomena, or computational algorithms. Despite its evident importance, the mathematical modeling of science still lacks a unifying framework and a comprehensive study of the topic. This volume fills this gap, reviewing and describing major threads in the mathematical modeling of science dynamics for a wider academic and professional audience. The model classes presented cover stochastic and statistical models, system-dynamics approaches, agent-based simulations, population-dynamics models, and complex-network models. The book comprises an introduction and a foundational chapter that defines and operationalizes terminology used in the study of science, as well as a review chapter that discusses the history of mathematical approaches to modeling science from an algorithmic-historiography perspective. It concludes with a survey of remaining challenges for future science models and their relevance for science and science policy."
Solid Freeform Fabrication is a set of manufacturing processes that are capable of producing complex freeform solid objects directly from a computer model of an object without part-specific tooling or knowledge. In essence, these methods are miniature manufacturing plants which come complete with material handling, information processing and materials processing. As such, these methods require technical knowledge from many disciplines; therefore, researchers, engineers, and students in Mechanical, Chemical, Electrical, and Manufacturing Engineering and Materials and Computer Science will all find some interest in this subject. Particular subareas of concern include manufacturing methods, polymer chemistry, computational geometry, control, heat transfer, metallurgy, ceramics, optics, and fluid mechanics. History of technology specialists may also find Chapter 1 of interest. Although this book covers the spectrum of different processes, the emphasis is clearly on the area in which the authors have the most experience, thermal laser processing. In particular, the authors have all been developers and inventors of techniques for the Selective Laser Sintering process and laser gas phase techniques (Selective Area Laser Deposition). This is a research book on the subject of Solid Freeform Fabrication.
This book presents and extends different known methods for solving different types of strong nonlinearities encountered by engineering systems. A better knowledge of the classical methods presented in the first part leads to a better choice of the so-called base functions. These are absolutely necessary to obtain the auxiliary functions involved in the optimal approaches presented in the second part.
Computational Neuroscience - A First Course provides an essential introduction to computational neuroscience and equips readers with a fundamental understanding of modeling the nervous system at the membrane, cellular, and network level. The book, which grew out of a lecture series held regularly for more than ten years to graduate students in neuroscience with backgrounds in biology, psychology and medicine, takes its readers on a journey through three fundamental domains of computational neuroscience: membrane biophysics, systems theory and artificial neural networks. The required mathematical concepts are kept as intuitive and simple as possible throughout the book, making it fully accessible to readers who are less familiar with mathematics. Overall, Computational Neuroscience - A First Course represents an essential reference guide for all neuroscientists who use computational methods in their daily work, as well as for any theoretical scientist approaching the field of computational neuroscience.
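To give a flavor of the membrane-level modeling such a course begins with, here is a sketch of the simplest spiking-neuron model in that tradition, the leaky integrate-and-fire neuron (a standard textbook model; the parameter values below are illustrative, not taken from this book):

```python
# Leaky integrate-and-fire: tau * dV/dt = -(V - V_rest) + R * I,
# with a spike and reset whenever V crosses threshold V_th.
# Integrated with forward Euler; units are ms and mV.

def lif_spike_times(I, tau=20.0, R=1.0, V_rest=-65.0, V_th=-50.0,
                    V_reset=-65.0, dt=0.1, t_max=200.0):
    """Return spike times (ms) for a constant input current I."""
    V, t, spikes = V_rest, 0.0, []
    while t < t_max:
        V += dt / tau * (-(V - V_rest) + R * I)
        if V >= V_th:
            spikes.append(t)
            V = V_reset
        t += dt
    return spikes

# Below rheobase (R*I < V_th - V_rest = 15 mV) the neuron is silent;
# above it, the firing rate grows with the input current.
assert lif_spike_times(I=10.0) == []
assert len(lif_spike_times(I=30.0)) > 0
```

Richer membrane models such as Hodgkin-Huxley replace the single leak term with voltage-dependent ionic conductances but keep the same integrate-and-threshold structure of simulation.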
Complex System Reliability presents a state-of-the-art treatment of complex multi-channel system reliability assessment and provides the requisite tools, techniques and algorithms required for designing, evaluating and optimizing ultra-reliable redundant systems. Critical topics that make Complex System Reliability a unique and definitive resource include: * redundant system analysis for k-out-of-n systems (including complex systems with embedded k-out-of-n structures) involving both perfect and imperfect fault coverage; * imperfect fault coverage analysis techniques, including algorithms for assessing the reliability of redundant systems in which each element is subject to a given coverage value (element level coverage) or in which the system uses voting to avoid the effects of a failed element (fault level coverage); and * state-of-the-art binary decision diagram analysis techniques, including the latest and most efficient algorithms for the reliability assessment of large, complex redundant systems. This practical presentation includes numerous fully worked examples that provide detailed explanations of both the underlying design principles and the techniques (such as combinatorial, recursive and binary decision diagram algorithms) used to obtain quantitative results. Many of the worked examples are based on the design of modern digital fly-by-wire control system technology. Complex System Reliability provides in-depth coverage of systems subject to either perfect or imperfect fault coverage and also the most recent techniques for correctly assessing the reliability of redundant systems that use mid-value-select voting as their primary means of redundancy management. It is a valuable resource for those involved in the design and reliability assessment of highly reliable systems, particularly in the aerospace and automotive sectors.
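The k-out-of-n structure at the heart of the treatment above has a simple closed form in the baseline case of independent, identical elements with perfect fault coverage. The sketch below (our own illustration with invented numbers, not the book's algorithms) computes it combinatorially:

```python
from math import comb

def k_out_of_n_reliability(k, n, p):
    """Probability that at least k of n independent elements,
    each working with probability p, are working."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# A 2-out-of-3 voting arrangement beats a single element whenever the
# element reliability exceeds one half:
single = 0.95
voted = k_out_of_n_reliability(2, 3, single)
# voted = 3 * 0.95^2 * 0.05 + 0.95^3 = 0.99275 > 0.95
```

Imperfect fault coverage, the book's central refinement, multiplies each term by the probability that the failures which did occur were successfully covered, which is where the element-level and fault-level coverage models diverge.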
Networked control systems are increasingly ubiquitous today, with applications ranging from vehicle communication and adaptive power grids to space exploration and economics. The optimal design of such systems presents major challenges, requiring tools from various disciplines within applied mathematics such as decentralized control, stochastic control, information theory, and quantization. A thorough, self-contained book, Stochastic Networked Control Systems: Stabilization and Optimization under Information Constraints aims to connect these diverse disciplines with precision and rigor, while conveying design guidelines to controller architects. Unique in the literature, it lays a comprehensive theoretical foundation for the study of networked control systems, and introduces an array of concrete tools for work in the field. Salient features include: * Characterization, comparison and optimal design of information structures in static and dynamic teams; operational, structural and topological properties of information structures in optimal decision making, with a systematic program for generating optimal encoding and control policies; the notion of signaling, and its utilization in stabilization and optimization of decentralized control systems. * Presentation of mathematical methods for stochastic stability of networked control systems using random-time, state-dependent drift conditions and martingale methods. * Characterization and study of information channels leading to various forms of stochastic stability such as stationarity, ergodicity, and quadratic stability, and connections with information and quantization theories; analysis of various classes of centralized and decentralized control systems. * Jointly optimal design of encoding and control policies over various information channels and under general optimization criteria, including a detailed coverage of linear-quadratic-Gaussian models.
* Decentralized agreement and dynamic optimization under information constraints. This monograph is geared toward a broad audience of academic and industrial researchers interested in control theory, information theory, optimization, economics, and applied mathematics. It could likewise serve as a supplemental graduate text. The reader is expected to have some familiarity with linear systems, stochastic processes, and Markov chains, but the necessary background can also be acquired in part through the four appendices included at the end.
This work proposes an answer to the question: what are computers for? It analyzes human activity and its relevance to computer use and interleaves a theory about the universal aspect of social life with a vision of how to harness computer power. Though technical in spirit and method, this book does not expect significant prior computer knowledge of the reader.
This book presents a comprehensive and consistent theory of estimation. The framework described leads naturally to a generalized maximum capacity estimator. This approach allows the optimal estimation of real-valued parameters, their number and intervals, as well as providing common ground for explaining the power of these estimators. Beginning with a review of coding and the key properties of information, the author goes on to discuss the techniques of estimation and develops the generalized maximum capacity estimator, based on a new form of Shannon's mutual information and channel capacity. Applications of this powerful technique in hypothesis testing and denoising are described in detail. Offering an original and thought-provoking perspective on estimation theory, Jorma Rissanen's book is of interest to graduate students and researchers in the fields of information theory, probability and statistics, econometrics and finance.
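A concrete anchor for the information quantities this framework builds on is the capacity of the binary symmetric channel, C = 1 - H(p), where H is the binary entropy in bits. The sketch below illustrates that standard Shannon-theoretic formula (our illustration of the background, not Rissanen's estimator itself):

```python
import math

def binary_entropy(p):
    # H(p) = -p log2 p - (1-p) log2 (1-p), with H(0) = H(1) = 0 by convention.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    # Capacity of a binary symmetric channel with crossover probability p.
    return 1.0 - binary_entropy(p)

# A noiseless channel carries one bit per use, while at p = 0.5 the
# output is statistically independent of the input and capacity is zero.
assert bsc_capacity(0.0) == 1.0
assert abs(bsc_capacity(0.5)) < 1e-12
```

The maximum-capacity estimator described above generalizes this picture by treating the data-generating mechanism itself as a channel whose capacity is to be maximized.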
This book comprises a selection of papers presented at a symposium organized under the aegis of COST Telecommunications Action 285. The main objective of the book is to enhance existing tools and develop new modeling and simulation tools for research in emerging multi-service telecommunication networks in the areas of model performance improvements, multilayer traffic modeling, and the important issue of evaluation and validation of the new modeling tools.
Khaled Fazel (Radio System Design, Marconi Communications, D-71522 Backnang, Germany) and Stefan Kaiser (DoCoMo Euro-Labs, Landsberger Strasse 312, D-80687 Munich, Germany). The field of multi-carrier and spread spectrum communications has become an important research topic, with an increasing number of research activities [1]. Especially in the last two years, besides deep system analysis of various multiple access schemes, new standardization activities in the framework of beyond 3G (B3G) concepts have been initiated. Multi-carrier transmission is considered a potential candidate to fulfil the requirements of the next generation system. The two important requirements of B3G/4G can be summarized as: i) much higher data rates for cellular mobile radio, and ii) a unique physical layer specification for indoor/hot-spot and outdoor/cellular applications, including fixed wireless access (FWA) schemes. The activities within the 3GPP and WiMAX fora are examples of such trends (see Fig. 1). [Figure 1, "Beyond 3G: Worldwide Standardization Activities", maps the standards landscape from PAN (IEEE 802.15, Bluetooth) through LAN (IEEE 802.11/WiFi, HiperLAN/2) and MAN (IEEE 802.16/WiMAX, HiperMAN, HiperAccess) to WAN (UMTS, EDGE, GSM/3GPP).] The WiMAX (Worldwide Interoperability for Microwave Access [2]) vision is to provide broadband wireless access, with the primary goal of promoting the IEEE 802.16a-e and ETSI-BRAN standards through interoperability testing and certification. In the first step, broadband access for so-called last-mile applications with fixed-position terminals is envisaged.
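The core idea of multi-carrier (OFDM) transmission can be sketched in a few lines: data symbols are placed on orthogonal subcarriers by an inverse DFT at the transmitter and recovered by a DFT at the receiver. The following is a generic illustration of that round trip (our own sketch, not code from the book):

```python
import cmath

def idft(X):
    # Transmitter: map N subcarrier symbols to N time-domain samples.
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def dft(x):
    # Receiver: recover the subcarrier symbols from the time-domain samples.
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

# QPSK symbols on 8 subcarriers survive the ideal-channel round trip:
symbols = [1+1j, 1-1j, -1+1j, -1-1j, 1+1j, -1-1j, 1-1j, -1+1j]
recovered = dft(idft(symbols))
assert all(abs(r - s) < 1e-9 for r, s in zip(recovered, symbols))
```

Real systems add a cyclic prefix and equalization to cope with multipath, and use the FFT rather than this direct O(N^2) transform, but the orthogonal-subcarrier principle is the same.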