Ambient Intelligence (AmI) is an emerging paradigm for knowledge discovery that originated as a design language for invisible computing and smart environments. Since its introduction in the late 1990s, AmI has matured and evolved, inspiring fields including computer science, interaction design, mobile computing, and cognitive science. Ubiquitous Developments in Ambient Computing and Intelligence: Human-Centered Applications provides a comprehensive collection of cutting-edge research in fields as diverse as distributed computing, human-computer interaction, ubiquitous computing, embedded systems, and other interdisciplinary areas that all contribute to AmI. Although predicting the technologies that will shape our ever-changing world is difficult, the book argues that Ambient Intelligence technology will develop considerably in the future.
As computers have infiltrated virtually every facet of our lives, so has computer science influenced nearly every academic subject in science, engineering, medicine, social science, the arts and humanities. Michael Knee offers a selective guide to the major resources and tools central to the computer industry: teaching institutions, research institutes and laboratories, manufacturers, standardization organizations, professional associations and societies, and publishers. He begins with a discussion of the three subject classification systems most commonly used to describe, index, and manage computer science information: the Association for Computing Machinery, Inspec, and the Library of Congress. An annotated bibliography of over 500 items follows, grouped by material type and featuring a mix of classic works and current sources.
This book presents physical-layer security as a promising paradigm for achieving the information-theoretic secrecy required for wireless networks. It explains how wireless networks are extremely vulnerable to eavesdropping attacks and discusses a range of security techniques including information-theoretic security, artificial noise aided security, security-oriented beamforming, and diversity assisted security approaches. It also provides an overview of the cooperative relaying methods for wireless networks such as orthogonal relaying, non-orthogonal relaying, and relay selection. Chapters explore the relay-selection designs for improving wireless secrecy against eavesdropping in time-varying fading environments, as well as a joint relay and jammer selection for wireless physical-layer security, where a relay is used to assist the transmission from the source to the destination and a friendly jammer is employed to transmit artificial noise to confuse the eavesdropper. Additionally, the security-reliability tradeoff (SRT) is mathematically characterized for wireless communications, and two main relay-selection schemes, single-relay and multi-relay selection, are devised for wireless SRT improvement. In single-relay selection, only the single best relay is chosen to assist the wireless transmission, while multi-relay selection invokes multiple relays to simultaneously forward the source transmission to the destination. Physical-Layer Security for Cooperative Relay Networks is designed for researchers and professionals working with networking or wireless security. Advanced-level students interested in networks, wireless, or privacy will also find this book a useful resource.
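As a rough illustration of the single-relay selection idea described above, here is a minimal sketch (not taken from the book; the log2-based secrecy-capacity formula, SNR values and names are illustrative assumptions): the relay with the largest instantaneous secrecy capacity is chosen.

    import math

    def select_best_relay(snr_main, snr_eve):
        """Single-relay selection: pick the relay whose secrecy capacity
        (main-channel capacity minus eavesdropper-channel capacity,
        floored at zero) is largest."""
        def secrecy_capacity(gm, ge):
            return max(0.0, math.log2(1 + gm) - math.log2(1 + ge))
        rates = [secrecy_capacity(gm, ge) for gm, ge in zip(snr_main, snr_eve)]
        best = max(range(len(rates)), key=rates.__getitem__)
        return best, rates[best]

    # Example: four candidate relays with known instantaneous SNRs.
    relay_index, secrecy_rate = select_best_relay(
        snr_main=[8.0, 15.0, 5.0, 20.0],  # relay -> destination SNRs
        snr_eve=[3.0, 12.0, 1.0, 18.0],   # relay -> eavesdropper SNRs
    )
    print(relay_index, round(secrecy_rate, 3))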
This book opens the door to a new, interesting and ambitious world of reversible and quantum computing research. It presents the state of the art required to travel around that world safely. Top universities, companies and government institutions worldwide are in a race to develop new methodologies, algorithms and circuits for reversible logic, quantum logic, reversible and quantum computing, and nanotechnologies. In this book, twelve reversible logic synthesis methodologies are presented for the first time in a single volume, together with some new proposals. Sequential reversible logic circuits are also discussed for the first time in a book. Reversible logic plays an important role in quantum computing, and any progress in the domain of reversible logic can be directly applied to quantum logic. One of the goals of this book is to show the application of reversible logic in quantum computing; to this end, a new implementation of wavelet and multiwavelet transforms using quantum computing is presented. Researchers in academia or industry and graduate students who work in logic synthesis, quantum computing, nanotechnology, and low-power VLSI circuit design will be interested in this book.
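As a rough illustration of the reversible-logic idea (a minimal sketch, not from the book; the Toffoli gate is a standard textbook example): a reversible gate is a bijection on its input space, so it loses no information and can be undone.

    from itertools import product

    def toffoli(a: int, b: int, c: int) -> tuple:
        """Toffoli (CCNOT) gate: flip the target bit c iff both controls a and b are 1."""
        return (a, b, c ^ (a & b))

    # Every 3-bit input maps to a distinct output, and the gate is its own inverse.
    assert len({toffoli(*bits) for bits in product((0, 1), repeat=3)}) == 8
    assert all(toffoli(*toffoli(*bits)) == bits for bits in product((0, 1), repeat=3))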
This Festschrift is in honor of Marilyn Wolf, on the occasion of her 60th birthday. Prof. Wolf is a renowned researcher and educator in Electrical and Computer Engineering who has made pioneering contributions in all of the major areas of Embedded, Cyber-Physical, and Internet of Things (IoT) Systems. This book provides a timely collection of contributions that cover important topics related to Smart Cameras, Hardware/Software Co-Design, and Multimedia applications. Embedded systems are everywhere; cyber-physical systems enable monitoring and control of complex physical processes with computers; and IoT technology is of increasing relevance in major application areas, including factory automation and smart cities. Smart cameras and multimedia technologies introduce novel opportunities and challenges in embedded, cyber-physical and IoT applications. Advanced hardware/software co-design methodologies provide valuable concepts and tools for addressing these challenges. The diverse topics of the chapters in this Festschrift reflect the great breadth and depth of Marilyn Wolf's contributions in research and education. The chapters have been written by some of Marilyn's closest collaborators and colleagues.
This book offers a self-study program on how mathematics, computer science and science can be profitably and seamlessly intertwined. It focuses on two-variable ODE models, both linear and nonlinear, and highlights theoretical and computational tools, using MATLAB to explain their solutions. It also shows how to solve cable models using separation of variables and the Fourier series.
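As a rough illustration of the kind of two-variable nonlinear ODE model the book treats (the book works in MATLAB; this analogous sketch uses Python/SciPy, and the predator-prey equations and parameter values are illustrative assumptions, not taken from the book):

    from scipy.integrate import solve_ivp

    def predator_prey(t, state, a=1.0, b=0.1, c=1.5, d=0.075):
        x, y = state                      # prey and predator populations
        return [a * x - b * x * y,        # dx/dt
                -c * y + d * x * y]       # dy/dt

    # Integrate the two-variable system from t = 0 to t = 20.
    solution = solve_ivp(predator_prey, t_span=(0, 20), y0=[10.0, 5.0])
    print(solution.y[:, -1])              # populations at t = 20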
This book provides a critical examination of how the choice of what to believe is represented in the standard model of belief change. In particular the use of possible worlds and infinite remainders as objects of choice is critically examined. Descriptors are introduced as a versatile tool for expressing the success conditions of belief change, addressing both local and global descriptor revision. The book presents dynamic descriptors such as Ramsey descriptors that convey how an agent's beliefs tend to be changed in response to different inputs. It also explores sentential revision and demonstrates how local and global operations of revision by a sentence can be derived as a special case of descriptor revision. Lastly, the book examines revocation, a generalization of contraction in which a specified sentence is removed in a process that may possibly also involve the addition of some new information to the belief set.
This book offers an introduction to applications prompted by tensor analysis, especially by the spectral tensor theory developed in recent years. It covers applications of tensor eigenvalues in multilinear systems, exponential data fitting, tensor complementarity problems, and tensor eigenvalue complementarity problems. It also addresses higher-order diffusion tensor imaging, third-order symmetric and traceless tensors in liquid crystals, piezoelectric tensors, strong ellipticity for elasticity tensors, and higher-order tensors in quantum physics. This book is a valuable reference resource for researchers and graduate students who are interested in applications of tensor eigenvalues.
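For orientation, a standard definition of a tensor H-eigenpair (textbook background, not quoted from this book): for an m-th order, n-dimensional tensor \(\mathcal{A} = (a_{i_1 \cdots i_m})\), a nonzero vector x and a scalar \(\lambda\) form an H-eigenpair when
\[
  \bigl(\mathcal{A}x^{m-1}\bigr)_i \;=\; \sum_{i_2,\dots,i_m=1}^{n} a_{i\,i_2\cdots i_m}\, x_{i_2}\cdots x_{i_m} \;=\; \lambda\, x_i^{\,m-1}, \qquad i = 1,\dots,n .
\]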
FORTRAN Programming Success in a Day: Beginner's guide to fast, easy and efficient learning of FORTRAN programming. What is Fortran? How can you become proficient in Fortran programming? This is the perfect starter book for anyone trying to learn this specific type of programming. Want to learn the data types quickly? Need examples of data types? How about variables, or how to manipulate variables in Fortran? Every type of intrinsic function in Fortran is right here. Finally, we dive into conditional statements, put in terms that you or anyone with no background in programming can understand.
This book presents the latest findings and ongoing research in the field of environmental informatics. It addresses a wide range of cross-cutting activities, such as efficient computing, virtual reality, disruption management, big data, open science and the internet of things, and showcases how these green information & communication technologies (ICT) can be used to effectively address environmental and societal challenges. Presenting a selection of extended contributions to the 32nd edition of the International Conference EnviroInfo 2018, at the Leibniz Supercomputing Centre in Garching near Munich, it is essential reading for anyone looking to expand their expertise in the area.
Collected together in this book are ten state-of-the-art expository articles on the most important topics in optimization, written by leading experts in the field. The book therefore provides a primary reference for those performing research in some area of optimization, or for those who have some basic knowledge of optimization techniques but wish to learn the most up-to-date and efficient algorithms for particular classes of problems. The first sections of each chapter are expository and therefore accessible to master's-level graduate students. However, the chapters also contain advanced material on current topics of interest to researchers. For instance, there are chapters describing the polynomial-time linear programming algorithms of Khachiyan and Karmarkar, and the techniques used to solve combinatorial and integer programming problems an order of magnitude larger than was possible just a few years ago. Overall, a comprehensive yet lively and up-to-date discussion of the state of the art in optimization is presented in this book.
Useful to healthcare providers, severity indices determine which patients are most at risk for infection, as well as the intensity of illness while in the hospital. "Text Mining Techniques for Healthcare Provider Quality Determination: Methods for Rank Comparisons" discusses the general practice of defining a patient severity index for risk adjustment and comparison of patient outcomes to assess quality factors. This Premier Reference Source examines the consequences of patient severity models and investigates the general assumptions required to perform standard severity adjustment.
This book investigates the coordinated power management of multi-tenant data centers, which account for a large portion of the data center industry. The authors discuss these facilities' rapid growth and their electricity consumption, which has huge economic and environmental impacts. The book covers the various coordinated management solutions in the existing literature, focusing on efficiency, sustainability, and demand response aspects. First, the authors provide background on the multi-tenant data center, covering the stakeholders, components, power infrastructure, and energy usage. Then, each power management mechanism is described in terms of motivation, problem formulation, challenges and solutions.
In this book the editors have gathered a number of contributions by persons who have been working on problems of Cognitive Technology (CT). The collection initiates explorations of the human mind via the technologies the mind produces. These explorations take as their point of departure the question: what happens when humans produce new technologies? Two interdependent perspectives from which such production can be approached are adopted:
- How and why constructs that have their origins in human mental life are embodied in physical environments when people fabricate their habitat, even to the point of those constructs becoming that very habitat;
- How and why these fabricated habitats affect, and feed back into, human mental life.
The aim of the CT research programme is to determine, in general, which technologies, and in particular, which interactive computer-based technologies, are humane with respect to the cognitive development and evolutionary adaptation of their end users. But what does it really mean to be humane in a technological world? To shed light on this central issue, other pertinent questions are raised, e.g.:
- Why are human minds externalised, i.e., what purpose does the process of externalisation serve?
- What can we learn about the human mind by studying how it externalises itself?
- How does the use of externalised mental constructs (the objects we call 'tools') change people fundamentally?
- To what extent does human interaction with technology serve as an amplification of human cognition, and to what extent does it lead to an atrophy of the human mind?
The book calls for a reflection on what a tool is. Strong parallels between CT and environmentalism are drawn: both are seen as trends originating in our need to understand how we manipulate, by means of the tools we have created, our natural habitat, consisting of, on the one hand, the cognitive environment which generates thought and determines action, and, on the other hand, the physical environment in which thought and action are realised. Both trends endeavour to protect the human habitat from the unwanted or uncontrolled impact of technology, and are ultimately concerned with the ethics and aesthetics of tool design and tool use. Among the topics selected by the contributors, the following themes emerge (the list is not exhaustive): using technology to empower the cognitively impaired; the ethics versus aesthetics of technology; the externalisation of emotive and affective life and its special dialectic ('mirror') effects; creativity enhancement: cognitive space, problem tractability; externalisation of sensory life and mental imagery; the engineering and modelling aspects of externalised life; externalised communication channels and inner dialogue; externalised learning protocols; relevance analysis as a theoretical framework for cognitive technology.
The best-selling 'Algorithmics' presents the most important concepts, methods and results that are fundamental to the science of computing. It starts by introducing the basic ideas of algorithms, including their structures and methods of data manipulation. It then goes on to demonstrate how to design accurate and efficient algorithms, and discusses their inherent limitations. As the author himself says in the preface to the book: 'This book attempts to present a readable account of some of the most important and basic topics of computer science, stressing the fundamental and robust nature of the science in a form that is virtually independent of the details of specific computers, languages and formalisms'.
This monograph addresses the state of the art of reduced order methods for modeling and computational reduction of complex parametrized systems, governed by ordinary and/or partial differential equations, with a special emphasis on real time computing techniques and applications in computational mechanics, bioengineering and computer graphics. Several topics are covered, including: design, optimization, and control theory in real-time with applications in engineering; data assimilation, geometry registration, and parameter estimation with special attention to real-time computing in biomedical engineering and computational physics; real-time visualization of physics-based simulations in computer science; the treatment of high-dimensional problems in state space, physical space, or parameter space; the interactions between different model reduction and dimensionality reduction approaches; the development of general error estimation frameworks which take into account both model and discretization effects. This book is primarily addressed to computational scientists interested in computational reduction techniques for large scale differential problems.
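As a rough illustration of one common model-reduction technique, proper orthogonal decomposition via the SVD (a minimal sketch, not taken from the book; random snapshot data stands in for simulation results):

    import numpy as np

    rng = np.random.default_rng(0)
    snapshots = rng.standard_normal((1000, 50))  # 50 snapshots of a 1000-dim state

    # Build a reduced basis from the leading left singular vectors.
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    basis = U[:, :10]                            # 10 POD modes

    # Project a full state onto the reduced space and lift it back.
    x_full = snapshots[:, 0]
    x_reduced = basis.T @ x_full                 # 10 coefficients instead of 1000
    x_approx = basis @ x_reduced
    print(np.linalg.norm(x_full - x_approx) / np.linalg.norm(x_full))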
This book offers an in-depth insight into the general-purpose finite element program MSC Marc, which is distributed by MSC Software Corporation. It is a specialized program for nonlinear problems (implicit solver) that is common in academia and industry. The primary goal of this book is to provide a comprehensive introduction to a special feature of this software: the user can write user subroutines in the programming language Fortran, which is the language of all classical finite element packages. This subroutine feature allows the user to replace certain modules of the core code and to implement new features such as constitutive laws or new elements. Thus, the functionality of commercial codes ('black box') can easily be extended by linking user-written code to the main core of the program. This feature makes it possible to combine the advantages of a commercial software package with the flexibility of a 'semi-open' code.
This volume presents some recent and principal developments related to computational intelligence and optimization methods in control. Theoretical aspects and practical applications of control engineering are covered by 14 self-contained contributions. Additional gems include discussions of future directions and research perspectives designed to add to the reader's understanding of both the challenges faced in control engineering and the insights that go into developing new techniques. With the knowledge obtained, readers are encouraged to determine the appropriate control method for specific applications.
This book covers principles and methods of software development for communication networks, based on practical experience from a number of software projects. The specific characteristics of this software are concurrent processes, time-critical response behaviour, complex functionality and very high quality requirements. Architecture plays an essential role in mastering this software complexity: it provides the rules and methods for an effective system design on which the entire development process can rely. This includes a complete specification methodology based on a formal language whose semantics is geared to the typical characteristics of communication software. The main focus of the discussion is the adaptation of software development to the growing demands regarding functionality, market orientation, cost and time.
This volume presents the latest advances and trends in stochastic models and related statistical procedures. Selected peer-reviewed contributions focus on statistical inference, quality control, change-point analysis and detection, empirical processes, time series analysis, survival analysis and reliability, statistics for stochastic processes, big data in technology and the sciences, statistical genetics, experiment design, and stochastic models in engineering. Stochastic models and related statistical procedures play an important part in furthering our understanding of the challenging problems currently arising in areas of application such as the natural sciences, information technology, engineering, image analysis, genetics, energy and finance, to name but a few. This collection arises from the 12th Workshop on Stochastic Models, Statistics and Their Applications, Wroclaw, Poland.
This book contains an edited selection of the papers accepted for presentation and discussion at the first International Symposium on Qualitative Research (ISQR2016), held in Porto, Portugal, July 12th-14th, 2016. The book and the symposium feature four main application fields (Education, Health, Social Sciences, and Engineering and Technology) and seven main subjects: Rationale and Paradigms of Qualitative Research (theoretical studies, critical reflection about epistemological, ontological and axiological dimensions); Systematization of Approaches with Qualitative Studies (literature review, integrating results, aggregation studies, meta-analysis, meta-analysis of qualitative meta-synthesis, meta-ethnography); Qualitative and Mixed Methods Research (emphasis on research processes that build on mixed methodologies but give priority to qualitative approaches); Data Analysis Types (content analysis, discourse analysis, thematic analysis, narrative analysis, etc.); Innovative Processes of Qualitative Data Analysis (design analysis, articulation and triangulation of different sources of data - images, audio, video); Qualitative Research in Web Context (eResearch, virtual ethnography, interaction analysis, latent corpus on the internet, etc.); and Qualitative Analysis with the Support of Specific Software (usability studies, user experience, the impact of software on the quality of research).
This book primarily addresses Intelligent Information Systems (IIS) and the integration of artificial intelligence, intelligent systems and technologies, database technologies and information systems methodologies to create the next generation of information systems. It includes original and state-of-the-art research on theoretical and practical advances in IIS, system architectures, tools and techniques, as well as "success stories" in intelligent information systems. Intended as an interdisciplinary forum in which scientists and professionals can share their research results and report on new developments and advances in intelligent information systems, technologies and related areas, as well as their applications, it offers a valuable resource for researchers and practitioners alike.
This book focuses on the development of three novel approaches that build up a framework for the frequency-domain analysis and design of nonlinear systems. The concepts are derived from the Volterra series representation of nonlinear systems described by nonlinear difference or differential equations. Occupying the middle ground between traditional linear approaches and more complex nonlinear system theories, the book helps readers get a good start in analysing and exploiting nonlinearities. Analysis and Design of Nonlinear Systems in the Frequency Domain provides clear illustrations and examples at the beginning and end of each chapter, respectively, making it of interest to both academics and practising engineers.
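For context, the Volterra series representation that the framework builds on can be written (standard background, not quoted from the book) as
\[
  y(t) \;=\; \sum_{n=1}^{\infty} \int_{-\infty}^{\infty}\!\cdots\!\int_{-\infty}^{\infty} h_n(\tau_1,\dots,\tau_n)\,\prod_{i=1}^{n} u(t-\tau_i)\, d\tau_1 \cdots d\tau_n ,
\]
with the n-th order generalized frequency response function obtained as the multidimensional Fourier transform of the kernel h_n:
\[
  H_n(j\omega_1,\dots,j\omega_n) \;=\; \int_{-\infty}^{\infty}\!\cdots\!\int_{-\infty}^{\infty} h_n(\tau_1,\dots,\tau_n)\, e^{-j(\omega_1\tau_1+\cdots+\omega_n\tau_n)}\, d\tau_1 \cdots d\tau_n .
\]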