High-speed, power-efficient analog integrated circuits can be used as standalone devices or to interface with modern digital signal processors and micro-controllers in various applications, including multimedia, communication, instrumentation, and control systems. New architectures and the shrinking device geometries of complementary metal-oxide-semiconductor (CMOS) technologies have accelerated the movement toward system-on-a-chip design, which merges analog circuits with digital and radio-frequency components.
This volume of LNCSE is a collection of the papers from the proceedings of the third workshop on sparse grids and applications. Sparse grids are a popular approach for the numerical treatment of high-dimensional problems. Where classical numerical discretization schemes fail in more than three or four dimensions, sparse grids, in their different guises, are frequently the method of choice, be it spatially adaptive in the hierarchical basis or via the dimensionally adaptive combination technique. Demonstrating once again the importance of this numerical discretization scheme, the selected articles present recent advances on the numerical analysis of sparse grids as well as efficient data structures. The book also discusses a range of applications, including uncertainty quantification and plasma physics.
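To make the dimensional scaling concrete, here is a minimal sketch, not taken from the book and with illustrative function names, that counts the interior points of a full tensor grid versus a regular sparse grid assembled from hierarchical subspaces:

```python
from math import comb

def full_grid_size(n, d):
    """Interior points of the full tensor grid at level n in d dimensions."""
    return (2**n - 1)**d

def sparse_grid_size(n, d):
    """Interior points of the regular sparse grid: a hierarchical subspace
    with level sum s holds 2**(s - d) points, and comb(s - 1, d - 1) level
    vectors share that sum; the sparse grid keeps only s <= n + d - 1."""
    return sum(comb(s - 1, d - 1) * 2**(s - d) for s in range(d, n + d))

for d in (2, 3, 5, 10):
    print(d, full_grid_size(6, d), sparse_grid_size(6, d))
```

At level 6 the full grid count explodes exponentially with dimension while the sparse count grows far more slowly, which is the effect the blurb alludes to.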
Pursuing an interdisciplinary approach, this book offers detailed insights into the empirical relationships between overall social key figures of states and cultures in the fields of information and communication technology (ICT) (the digital divide and inequality), the economy, education, and religion. Its goal is to bridge the 'cultural gap' between computer scientists, engineers, economists, and social and political scientists by providing a mutual understanding of the essential challenges posed and opportunities offered by a global information and knowledge society. In a sense, the historically unprecedented technical advances in the field of ICT are shaping humanity at different levels and forming a hybrid (intelligent) human-technology system, a so-called global superorganism. The main innovation is the combined study of digitization and globalization in the context of growing social inequalities, collapse, and sustainable development, and of how a convergence towards a kind of global culture could take place. Accordingly, the book discusses the spread of ICT, Internet governance, the balance between the central concentration of power and the extent of decentralized power distribution, the inclusion or exclusion of people and states in global communication processes, and the capacity for global empathy or culture.
Soft computing, as an engineering science, and statistics, as a classical branch of mathematics, emphasize different aspects of data analysis.
With more restrictions upon animal experimentation, the pharmaceutical industry is currently focusing on a new generation of experiments and technologies that are considerably more efficient and less controversial. The integration of computational and experimental strategies has led to the identification and development of promising compounds. Computer Applications in Drug Discovery and Development is a pivotal reference source that provides innovative research on the application of computers for discovering and designing new drugs in modern molecular biology and medicinal chemistry. While highlighting topics such as chemical structure databases and dataset utilization, this publication delves into the current panorama of drug discovery, where high drug failure rates are a major concern and properly designed virtual screening strategies can be a time-saving, cost-effective, and productive alternative. This book is ideally designed for chemical engineers, pharmacists, molecular biologists, students, researchers, and academicians seeking current research on the unexplored avenues and future perspectives of drug design.
This accessible compendium examines a collection of significant technology firms that have helped to shape the field of computing and its impact on society. Each company is introduced with a brief account of its history, followed by a concise account of its key contributions. The selection covers a diverse range of historical and contemporary organizations from pioneers of e-commerce to influential social media companies. Features: presents information on early computer manufacturers; reviews important mainframe and minicomputer companies; examines the contributions to the field of semiconductors made by certain companies; describes companies that have been active in developing home and personal computers; surveys notable research centers; discusses the impact of telecommunications companies and those involved in the area of enterprise software and business computing; considers the achievements of e-commerce companies; provides a review of social media companies.
To solve performance problems in modern computing infrastructures, often comprising thousands of servers running hundreds of applications, spanning multiple tiers, you need tools that go beyond mere reporting. You need tools that enable performance analysis of application workflow across the entire enterprise. That's what PDQ (Pretty Damn Quick) provides. PDQ is an open-source performance analyzer based on the paradigm of queues. Queues are ubiquitous in every computing environment as buffers, and since any application architecture can be represented as a circuit of queueing delays, PDQ is a natural fit for analyzing system performance. Building on the success of the first edition, this considerably expanded second edition now comprises four parts. Part I contains the foundational concepts, as well as a new first chapter that explains the central role of queues in successful performance analysis. Part II provides the basics of queueing theory in a highly intelligible style for the non-mathematician, requiring little more than high-school algebra. Part III presents many practical examples of how PDQ can be applied. The PDQ manual has been relegated to an appendix in Part IV, along with solutions to the exercises contained in each chapter. Throughout, the Perl code listings have been newly formatted to improve readability. The PDQ code and updates to the PDQ manual are available from the author's web site at www.perfdynamics.com.
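As a taste of the queueing paradigm, here is a minimal sketch of an open M/M/1 queue computed from textbook formulas; it illustrates the kind of quantity PDQ reports but deliberately avoids the PDQ API itself.

```python
# An open M/M/1 queue from first principles: utilization rho = lambda * S,
# residence time R = S / (1 - rho), mean number in system N = rho / (1 - rho).
def mm1_metrics(arrival_rate, service_time):
    rho = arrival_rate * service_time          # server utilization
    if rho >= 1:
        raise ValueError("unstable queue: utilization must be < 1")
    residence = service_time / (1 - rho)       # total time in system
    in_system = rho / (1 - rho)                # mean number in system
    return rho, residence, in_system

# requests arriving at 0.75/s, each needing 1 s of service on average
print(mm1_metrics(0.75, 1.0))                  # (0.75, 4.0, 3.0)
```

At 75% utilization a request already spends four times its service time in the system, which is exactly the nonlinearity that queueing models expose.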
Studying narratives is often the best way to gain a good understanding of how various aspects of human information are organized and integrated: the narrator employs specific informational methods to build the whole structure of a narrative, combining temporally constructed events in light of an array of relationships to the narratee, and these methods reveal the interaction between the rational and the sensitive aspects of human information. Computational and Cognitive Approaches to Narratology discusses issues of narrative-related information and communication technologies, cognitive mechanisms and analyses, and theoretical perspectives on narratives and the story generation process. Focusing on emerging research as well as applications in a variety of fields including marketing, philosophy, psychology, art, and literature, this timely publication is an essential reference source for researchers, professionals, and graduate students in various information technology, cognitive studies, design, and creative fields.
Weighted finite automata are classical nondeterministic finite automata in which the transitions carry weights. These weights may model, for example, the cost involved when executing a transition, the resources or time needed for this, or the probability or reliability of its successful execution. Weights can also be added to classical automata with infinite state sets like pushdown automata, and this extension constitutes the general concept of weighted automata. Since their introduction in the 1960s they have stimulated research in related areas of theoretical computer science, including formal language theory, algebra, logic, and discrete structures. Moreover, weighted automata and weighted context-free grammars have found application in natural-language processing, speech recognition, and digital image compression. This book covers all the main aspects of weighted automata and formal power series methods, ranging from theory to applications. The contributors are the leading experts in their respective areas, and each chapter presents a detailed survey of the state of the art and pointers to future research. The chapters in Part I cover the foundations of the theory of weighted automata, specifically addressing semirings, power series, and fixed point theory. Part II investigates different concepts of weighted recognizability. Part III examines alternative types of weighted automata and various discrete structures other than words. Finally, Part IV deals with applications of weighted automata, including digital image compression, fuzzy languages, model checking, and natural-language processing. Computer scientists and mathematicians will find this book an excellent survey and reference volume, and it will also be a valuable resource for students exploring this exciting research area.
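The core definition is simple enough to sketch: a word's weight is the semiring sum, over all paths, of the semiring product of transition weights. The following minimal Python sketch evaluates this by forward propagation; the two-state automaton and the helper names are illustrative, not taken from the book.

```python
# Evaluate the weight a weighted finite automaton assigns to a word,
# parameterized by the semiring operations (plus, times, zero).
def word_weight(word, init, trans, final, plus, times, zero):
    front = dict(init)                          # weight of reaching each state
    for a in word:
        nxt = {}
        for q, w in front.items():
            for q2, w2 in trans.get((q, a), {}).items():
                nxt[q2] = plus(nxt.get(q2, zero), times(w, w2))
        front = nxt
    total = zero
    for q, w in front.items():
        if q in final:
            total = plus(total, times(w, final[q]))
    return total

# probability semiring (+, *, 0): P(toy two-state model emits "ab")
init, final = {0: 1.0}, {1: 1.0}
trans = {(0, "a"): {0: 0.3, 1: 0.7}, (0, "b"): {0: 1.0}, (1, "b"): {1: 0.5}}
print(word_weight("ab", init, trans, final,
                  lambda x, y: x + y, lambda x, y: x * y, 0.0))   # 0.35
```

Swapping in the tropical semiring (min, +, infinity) turns the same routine into a shortest-path cost computation, which is the flexibility semiring-based methods provide.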
As information availability pervades every aspect of life, new trends, technologies, and enhancements must be carefully researched, evaluated and implemented in order to fabricate successful computing applications. Designing Solutions-Based Ubiquitous and Pervasive Computing: New Issues and Trends examines current practices and challenges faced by designers of Ubiquitous and Pervasive Computing projects. Analyzing theoretical assumptions, empirical research, practical implementations, and case studies, this book intends to present a comprehensive yet concise overview of the recent progress in ubiquitous and pervasive computing applications, as well as discuss new methodologies, technologies and approaches currently practiced. Truly interdisciplinary, this publication serves as a valuable reference for researchers in the fields of electrical engineering and computer science, students, educators, and industrial trainers interested in network sensors, embedded systems, distributed systems, computer networks, and computer science.
Reaction-diffusion and excitable media are amongst the most intriguing substrates. Despite the apparent simplicity of the physical processes involved, these media exhibit a wide range of amazing patterns: from target and spiral waves to travelling localisations and stationary breathing patterns. These media are at the heart of most natural processes, including the morphogenesis of living beings, geological formations, nervous and muscular activity, and socio-economic developments. This book explores a minimalist paradigm for studying reaction-diffusion and excitable media using locally-connected networks of finite-state machines: cellular automata and automata on proximity graphs. Cellular automata are marvellous objects per se because they show us how to generate and manage complexity using very simple rules of dynamical transitions. When combined with the reaction-diffusion paradigm, cellular automata become an essential user-friendly tool for modelling natural systems and designing future and emergent computing architectures. The book brings together hot topics of non-linear sciences, complexity, and future and emergent computing. It shows how to discover propagating localisations and perform computation with them in very simple two-dimensional automaton models. The paradigms, models and implementations presented in the book strengthen the theoretical foundations of future and emergent computing and lay the keystones for physically embodied information processing systems.
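As a flavour of such models, here is a minimal sketch of the classic Greenberg-Hastings excitable-medium automaton, an illustrative example rather than a model from the book: each cell is resting, excited, or refractory, and a resting cell fires when an excited neighbour touches it, so a single seed produces an expanding target wave.

```python
import numpy as np

def step(grid):
    """One synchronous update: 0 = resting, 1 = excited, 2 = refractory."""
    excited = grid == 1
    # excited von Neumann neighbours, with wrap-around boundaries
    nbrs = (np.roll(excited, 1, 0) | np.roll(excited, -1, 0) |
            np.roll(excited, 1, 1) | np.roll(excited, -1, 1))
    new = grid.copy()
    new[(grid == 0) & nbrs] = 1     # resting cells next to excitation fire
    new[grid == 1] = 2              # excited cells become refractory
    new[grid == 2] = 0              # refractory cells recover
    return new

grid = np.zeros((64, 64), dtype=int)
grid[32, 32] = 1                    # a single excitation seeds a target wave
for _ in range(10):
    grid = step(grid)
print(int((grid == 1).sum()), "cells excited after 10 steps")
```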
Social insects such as ants and termites can be viewed as powerful problem-solving systems with sophisticated collective intelligence. Composed of simple interacting agents, these systems derive their intelligence from the networks of interactions among individuals and between individuals and the environment. Social insects are also a powerful metaphor for artificial intelligence. The problems they solve - for instance, finding food, dividing labor among nestmates, building nests, and responding to external challenges - have important counterparts in engineering and computer science. This book provides a detailed look at models of social insect behaviour and how these can be applied in the design of complex systems. It draws upon a complementary blend of biology and computer science, including artificial intelligence, robotics, operations research, information display, and computer graphics. The book should appeal to a broadly interdisciplinary audience of modellers, engineers, neuroscientists, and computer scientists, as well as some biologists and ecologists.
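A minimal sketch of the pheromone mechanism behind such models, with purely illustrative parameters: simulated ants repeatedly choose between a short and a long path in proportion to pheromone, deposits favour the shorter path, and evaporation forgets old choices, so the colony converges on the short path without any central control.

```python
import random

tau = {"short": 1.0, "long": 1.0}          # pheromone level per path
length = {"short": 1.0, "long": 2.0}       # path lengths

for _ in range(500):
    # each ant picks a path with probability proportional to its pheromone
    p_short = tau["short"] / (tau["short"] + tau["long"])
    path = "short" if random.random() < p_short else "long"
    for p in tau:
        tau[p] *= 0.99                      # evaporation forgets old trails
    tau[path] += 1.0 / length[path]         # shorter path gets a stronger deposit

print({p: round(t, 2) for p, t in tau.items()})   # "short" ends up dominant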
"Discrete-Time Linear Systems: Theory and Design with Applications "combines system theory and design in order to show the importance of system theory and its role in system design. The book focuses on system theory (including optimal state feedback and optimal state estimation) and system design (with applications to feedback control systems and wireless transceivers, plus system identification and channel estimation).
Towards Solid-State Quantum Repeaters: Ultrafast, Coherent Optical Control and Spin-Photon Entanglement in Charged InAs Quantum Dots summarizes several state-of-the-art coherent spin manipulation experiments in III-V quantum dots. High-fidelity optical manipulation, decoherence due to nuclear spins, and the extraction of spin coherence are all discussed, as is the generation of entanglement between a single spin qubit and a photonic qubit. The experimental results are analyzed and discussed in the context of future quantum technologies, such as quantum repeaters. Single spins in optically active semiconductor host materials have emerged as leading candidates for quantum information processing (QIP). The quantum nature of the spin allows for encoding of stationary, memory quantum bits (qubits), and the relatively weak interaction with the host material preserves the spin coherence. On the other hand, optically active host materials permit direct interfacing with light, which can be used for all-optical qubit manipulation, and for efficiently mapping matter qubits into photonic qubits that are suited for long-distance quantum communication.
Computer-Aided Innovation (CAI) is emerging as a strategic domain of research and application to support enterprises throughout the overall innovation process. The 5.4 Working Group of IFIP aims at defining the scientific foundation of Computer-Aided Innovation systems and at identifying the state of the art and trends of CAI tools and methods. These proceedings derive from the second Topical Session on Computer-Aided Innovation organized within the 20th World Computer Congress of IFIP. The goal of the Topical Session is to provide a survey of existing technologies and research activities in the field and to identify opportunities for integrating CAI with other PLM systems. Reflecting the heterogeneous needs of innovation-related activities, the papers published in this volume are characterized by multidisciplinary contents and complementary perspectives and scopes. Such a richness of topics and disciplines will certainly contribute to the promotion of fruitful new collaborations and synergies within the IFIP community. Gaetano Cascini, Florence, April 30th, 2008. CAI Topical Session Organization: the IFIP Topical Session on Computer-Aided Innovation (CAI) is a co-located conference organized under the auspices of the IFIP World Computer Congress (WCC) 2008 in Milano, Italy. Gaetano Cascini, CAI Program Committee Chair, [email protected]
For half a century at least, I.T. teams have focused on solving business problems through computer technology, largely ignoring the human element in their interactions with end users. In his new book I.T. IN CRISIS: A NEW BUSINESS MODEL, consultant L. Paul Ouellette shows how to bring the I.T. team into the twenty-first century. Organizations that employ I.T. professionals are facing a new economic landscape, one where closer, more engaged relationships with internal and external customers are not merely nice to have but essential for organizational survival. I.T.'s old "business as usual" approach of letting the relationship thing take care of itself is, Ouellette warns, now a recipe for disaster. I.T.'s challenge is to adapt to the customer-focused operational realities of the twenty-first century. Teams that meet this challenge will thrive, and will create extraordinary opportunities for themselves and their organizations. Teams that don't, Ouellette believes, will be marginalized or phased out. How do we make this long-overdue transition? By upgrading the I.T. professional's skill sets and moving from the back room to the forefront of the business, the place where person-to-person connections with customers as human beings take place. In I.T. IN CRISIS: A NEW BUSINESS MODEL, Ouellette offers proven, real-world strategies for I.T. teams to forge closer bonds with their end users. He shows I.T. professionals how to change the way their customers think about I.T., how to improve I.T.'s standing within their own organizations, and how to enhance their own careers. He offers the one tool to turn negative relationships into positive ones, methods for successfully conducting the three main points of your clients' interactions, what clients really want from I.T., and the five steps to building a sustainable service strategy, along with guidance on building very specific empathy, listening, rapport-building, and overall relationship management capacities. Ouellette also includes the case studies and action forms that will help I.T. teams to execute on the book's core concept. Today's business environment is highly competitive. In order to survive, organizations must create new business models that focus like a laser beam on the customer. For those who work in Information Technology (I.T.), customer relations is no longer a "nice to have" skill but a "must have" skill. The average professional information technologist is lacking skills in this area, and thus I.T. faces a crisis. For the first time since the introduction of computer technology to the world of business, I.T. funding has been reduced, and investments going into computer business technology are declining. I.T. is no longer seen as the savior of a company's bottom line. This state of affairs actually represents a new opportunity for I.T. If we make a conscious decision to conduct business differently, upgrade our skills, and focus on the customer, we can get the credit, attention, and recognition we deserve. Computer technology solutions are but one part of what we offer. In the twenty-first century, we need to play a much broader role, build stronger relationships with the people we serve, and become an irreplaceable part of the client's business solution. Addressing the problems facing today's I.T. professional and offering corrective strategies are the sole purposes of this book. Once we do this, we will not only succeed, we will thrive. I.T. IN CRISIS: A NEW BUSINESS MODEL strategizes how to make this transition.
MUSIC 2013 will be the most comprehensive text focused on the various aspects of Mobile, Ubiquitous and Intelligent computing. MUSIC 2013 provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of intelligent technologies in mobile and ubiquitous computing environments. MUSIC 2013 follows the 3rd International Conference on Mobile, Ubiquitous, and Intelligent Computing (MUSIC-12, Vancouver, Canada, 2012), which was itself the successor to a series of highly successful International Workshops on Multimedia, Communication and Convergence technologies: MCC-11 (Crete, Greece, June 2011) and MCC-10 (Cebu, Philippines, August 2010).
These are the proceedings of the 20th international conference on domain decomposition methods in science and engineering. Domain decomposition methods are iterative methods for solving the often very large linear or nonlinear systems of algebraic equations that arise when various problems in continuum mechanics are discretized using finite elements. They are designed for massively parallel computers and keep the memory hierarchy of such systems in mind. This is essential for approaching peak floating-point performance. There is an increasingly well-developed theory which is having a direct impact on the development and improvement of these algorithms.
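A minimal sketch of the idea, an illustrative toy rather than one of the conference algorithms: solve the 1D Poisson problem -u'' = f by repeatedly solving exactly on two overlapping subdomains, exchanging boundary values taken from the previous iterate.

```python
import numpy as np

n, h = 49, 1.0 / 50                      # interior points of [0, 1]
f = np.ones(n)                           # right-hand side of -u'' = f
u = np.zeros(n)                          # initial guess

def solve_subdomain(u, lo, hi):
    """Exactly solve -u'' = f on interior indices lo..hi-1, taking
    Dirichlet data at the cut ends from the current iterate u."""
    m = hi - lo
    A = (2 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)) / h**2
    rhs = f[lo:hi].copy()
    if lo > 0:
        rhs[0] += u[lo - 1] / h**2       # left cut uses neighbouring value
    if hi < n:
        rhs[-1] += u[hi] / h**2          # right cut uses neighbouring value
    return np.linalg.solve(A, rhs)

for _ in range(20):
    left = solve_subdomain(u, 0, 30)     # subdomain 1: indices 0..29
    right = solve_subdomain(u, 20, n)    # subdomain 2: indices 20..48
    u[0:30], u[20:n] = left, right       # overlap 20..29 takes right's values

x = np.linspace(h, 1 - h, n)
print("max error:", np.abs(u - x * (1 - x) / 2).max())
```

Because both subdomain solves use the previous iterate, the two solves are independent and could run in parallel, which is the trait that makes these methods attractive on massively parallel machines.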
What does it mean to live and work inside the information and communication technology revolution? The nature and significance of newly emerging patterns of social and technical interaction, as digital technologies become more pervasive in the knowledge economy, are the focus of this book. The places and spaces where digital technologies are in use are examined to show why such use may or may not be associated with improvements in society. Studies of on- and off-line interactions between individuals, and of collective attempts to govern and manage the new technologies, show that the communication revolution is essentially about people, social organization, adaptation, and control, not just technologies. This book contains original empirical studies conducted within a programme of research in the Information, Networks and Knowledge (INK) research centre at SPRU, University of Sussex.
You may like...
Object Management in Distributed… by Wujuan Lin, Bharadwaj Veeravalli (Hardcover): R2,748 (Discovery Miles 27 480)
Data Dissemination and Query in Mobile… by Jiming Chen, Jialu Fan, … (Paperback): R1,067 (Discovery Miles 10 670)
Enterprise Big Data Engineering… by Martin Atzmueller, Samia Oussena, … (Hardcover): R5,155 (Discovery Miles 51 550)
Oracle Database 10g Data Warehousing by Lilian Hobbs, Susan Hillson, … (Paperback): R1,827 (Discovery Miles 18 270)
Exam Ref 70-767 Implementing a SQL Data… by Jose Chinchilla, Raj Uchhana (Paperback)