Reaction-diffusion and excitable media are amongst the most intriguing substrates. Despite the apparent simplicity of the physical processes involved, these media exhibit a wide range of amazing patterns: from target and spiral waves to travelling localisations and stationary breathing patterns. These media are at the heart of most natural processes, including the morphogenesis of living beings, geological formations, nervous and muscular activity, and socio-economic developments. This book explores a minimalist paradigm for studying reaction-diffusion and excitable media using locally connected networks of finite-state machines: cellular automata and automata on proximity graphs. Cellular automata are marvellous objects per se because they show us how to generate and manage complexity using very simple rules of dynamical transitions. When combined with the reaction-diffusion paradigm, cellular automata become an essential, user-friendly tool for modelling natural systems and designing future and emergent computing architectures. The book brings together hot topics of non-linear sciences, complexity, and future and emergent computing. It shows how to discover propagating localisations and perform computation with them in very simple two-dimensional automaton models. The paradigms, models and implementations presented in the book strengthen the theoretical foundations of future and emergent computing and lay the keystones for physically embodied information-processing systems.
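The flavour of such two-dimensional excitable automata can be conveyed with a minimal sketch: a generic Greenberg-Hastings-style rule on a square lattice (an illustrative example, not a model taken from the book; the state names and parameters are made up). A single excited seed produces an expanding target wave, and colliding or broken wavefronts produce spirals.

```python
import numpy as np

# Minimal Greenberg-Hastings-style excitable cellular automaton (sketch).
# States: 0 = resting, 1 = excited, 2 = refractory.
RESTING, EXCITED, REFRACTORY = 0, 1, 2

def step(grid):
    """One synchronous update on a 2D lattice with Moore (8-cell) neighbourhoods."""
    excited = (grid == EXCITED).astype(int)
    # Count excited neighbours via periodic shifts of the lattice.
    neighbours = sum(np.roll(np.roll(excited, di, 0), dj, 1)
                     for di in (-1, 0, 1) for dj in (-1, 0, 1)
                     if (di, dj) != (0, 0))
    new = np.empty_like(grid)
    new[grid == EXCITED] = REFRACTORY      # excited cells become refractory
    new[grid == REFRACTORY] = RESTING      # refractory cells recover
    new[grid == RESTING] = RESTING
    fire = (grid == RESTING) & (neighbours >= 1)
    new[fire] = EXCITED                    # resting cells with an excited
    return new                             # neighbour become excited

grid = np.zeros((50, 50), dtype=int)
grid[25, 25] = EXCITED                     # a single seed in the centre
for _ in range(10):                        # grows into a target wave:
    grid = step(grid)                      # an expanding square ring
print(int((grid == EXCITED).sum()))
```

With a Moore neighbourhood the excited front after t steps is the ring of cells at Chebyshev distance t from the seed (8t cells), trailed by a one-step refractory ring that prevents re-excitation behind the wave.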
This unique text/reference presents a fresh look at nonlinear processing through nonlinear eigenvalue analysis, highlighting how one-homogeneous convex functionals can induce nonlinear operators that can be analyzed within an eigenvalue framework. The text opens with an introduction to the mathematical background, together with a summary of classical variational algorithms for vision. This is followed by a focus on the foundations and applications of the new multi-scale representation based on nonlinear eigenproblems. The book then concludes with a discussion of new numerical techniques for finding nonlinear eigenfunctions, and promising research directions beyond the convex case. Topics and features: introduces the classical Fourier transform and its associated operator and energy, and asks how these concepts can be generalized in the nonlinear case; reviews the basic mathematical notions, briefly outlining the use of variational and flow-based methods to solve image-processing and computer vision problems; describes the properties of the total variation (TV) functional, and how the concept of nonlinear eigenfunctions relates to convex functionals; provides a spectral framework for one-homogeneous functionals, and applies this framework to denoising, texture processing and image fusion; proposes novel ways to solve the nonlinear eigenvalue problem using special flows that converge to eigenfunctions; examines graph-based and nonlocal methods, for which a TV eigenvalue analysis gives rise to strong segmentation, clustering and classification algorithms; presents an approach to generalizing the nonlinear spectral concept beyond the convex case, based on pixel decay analysis; discusses relations to other branches of image processing, such as wavelets and dictionary-based methods.
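The nonlinear eigenproblem at the heart of this kind of framework can be stated compactly (a sketch of the standard formulation for one-homogeneous functionals, not an excerpt from the book; the symbols are generic):

```latex
% Nonlinear eigenfunctions of a convex, one-homogeneous functional J:
\lambda u \in \partial J(u),
\qquad J(c\,u) = |c|\, J(u) \quad \forall\, c \in \mathbb{R},
% where \partial J denotes the subdifferential. Compare the classical
% linear case: the Dirichlet energy \tfrac{1}{2}\|\nabla u\|_2^2 has
% subdifferential \{-\Delta u\}, recovering the Fourier eigenproblem
% -\Delta u = \lambda u.
```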
This original work offers fascinating new insights into established signal processing techniques, integrating deep mathematical concepts from a range of different fields, which will be of great interest to all researchers involved with image processing and computer vision applications, as well as computations for more general scientific problems.
"Discrete-Time Linear Systems: Theory and Design with Applications" combines system theory and design in order to show the importance of system theory and its role in system design. The book focuses on system theory (including optimal state feedback and optimal state estimation) and system design (with applications to feedback control systems and wireless transceivers, plus system identification and channel estimation).
Towards Solid-State Quantum Repeaters: Ultrafast, Coherent Optical Control and Spin-Photon Entanglement in Charged InAs Quantum Dots summarizes several state-of-the-art coherent spin manipulation experiments in III-V quantum dots. High-fidelity optical manipulation, decoherence due to nuclear spins, and the extraction of spin coherence are discussed, as is the generation of entanglement between a single spin qubit and a photonic qubit. The experimental results are analyzed and discussed in the context of future quantum technologies, such as quantum repeaters. Single spins in optically active semiconductor host materials have emerged as leading candidates for quantum information processing (QIP). The quantum nature of the spin allows for encoding of stationary, memory quantum bits (qubits), and the relatively weak interaction with the host material preserves the spin coherence. On the other hand, optically active host materials permit direct interfacing with light, which can be used for all-optical qubit manipulation, and for efficiently mapping matter qubits into photonic qubits that are suited for long-distance quantum communication.
This LNCSE volume collects selected papers from the proceedings of the third Workshop on Sparse Grids and Applications. Sparse grids are a popular approach for the numerical treatment of high-dimensional problems. Where classical numerical discretization schemes fail in more than three or four dimensions, sparse grids, in their different guises, are frequently the method of choice, be it spatially adaptive in the hierarchical basis or via the dimensionally adaptive combination technique. Demonstrating once again the importance of this numerical discretization scheme, the selected articles present recent advances on the numerical analysis of sparse grids as well as efficient data structures. The book also discusses a range of applications, including uncertainty quantification and plasma physics.
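The combination technique mentioned above can be sketched, in its classical two-dimensional form, as follows (a standard textbook formula under one common indexing convention, not taken from these proceedings):

```latex
% 2D sparse grid combination technique: anisotropic full-grid solutions
% f_{l_1, l_2} of level l = (l_1, l_2) are combined along two diagonals:
f_n^{c}(x, y) \;=\; \sum_{l_1 + l_2 = n} f_{l_1, l_2}(x, y)
\;-\; \sum_{l_1 + l_2 = n - 1} f_{l_1, l_2}(x, y).
% Each component grid is cheap, and the combination involves far fewer
% points than the full level-n tensor grid.
```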
Computer-Aided Innovation (CAI) is emerging as a strategic domain of research and application to support enterprises throughout the overall innovation process. The 5.4 Working Group of IFIP aims at defining the scientific foundation of Computer-Aided Innovation systems and at identifying the state of the art and trends of CAI tools and methods. These proceedings derive from the second Topical Session on Computer-Aided Innovation organized within the 20th World Computer Congress of IFIP. The goal of the Topical Session is to provide a survey of existing technologies and research activities in the field and to identify opportunities for integrating CAI with other PLM systems. In keeping with the heterogeneous needs of innovation-related activities, the papers published in this volume are characterized by multidisciplinary contents and complementary perspectives and scopes. Such a richness of topics and disciplines will certainly contribute to the promotion of fruitful new collaborations and synergies within the IFIP community. Gaetano Cascini, Florence, April 30th, 2008. CAI Topical Session Organization: The IFIP Topical Session on Computer-Aided Innovation (CAI) is a co-located conference organized under the auspices of the IFIP World Computer Congress (WCC) 2008 in Milano, Italy. Gaetano Cascini, CAI Program Committee Chair, [email protected]
Meshfree methods are a modern alternative to classical mesh-based discretization techniques such as finite differences or finite element methods. Especially in a time-dependent setting or in the treatment of problems with strongly singular solutions, their independence of a mesh makes these methods highly attractive. This volume collects selected papers presented at the Sixth International Workshop on Meshfree Methods held in Bonn, Germany in October 2011. They address various aspects of this very active research field and cover topics from applied mathematics, physics and engineering.
For half a century at least, I.T. teams have focused on solving business problems through computer technology - largely ignoring the human element in their interactions with end users. In his new book I.T. IN CRISIS: A NEW BUSINESS MODEL, consultant L. Paul Ouellette shows how to bring the I.T. team into the twenty-first century. Organizations that employ I.T. professionals are facing a new economic landscape - one where closer, more engaged relationships with internal and external customers are not merely nice to have, but essential for organizational survival. I.T.'s old "business as usual" approach - letting the relationship thing take care of itself - is, Ouellette warns, now a recipe for disaster. I.T.'s challenge is to adapt to the customer-focused operational realities of the twenty-first century. Teams that meet this challenge will thrive, and will create extraordinary opportunities for themselves and their organizations. Teams that don't, Ouellette believes, will be marginalized or phased out. How do we make this (long-overdue) transition? By upgrading the I.T. professional's skill sets - and moving from the back room to the forefront of the business, the place where person-to-person connections with customers as human beings take place. In I.T. IN CRISIS: A NEW BUSINESS MODEL, Ouellette offers proven, real-world strategies for I.T. teams to forge closer bonds with their end users. He shows I.T. professionals how to change the way their customers think about I.T., how to improve I.T.'s standing within their own organizations, and how to enhance their own careers. He offers the number-one tool for turning negative relations into positive ones; methods for successfully conducting the three main points of your clients' interactions; what clients really want from I.T.; and the five steps to building a sustainable service strategy - along with very specific empathy, listening, rapport-building, and overall relationship management capacities.
Ouellette also includes the case studies and action forms that will help I.T. teams execute on the book's core concept. Today's business environment is highly competitive. In order to survive, organizations must create new business models that focus "like a laser beam" on the customer. For those who work in Information Technology (I.T.), customer relations is no longer a "nice to have" skill, but rather a "must have" skill. The average professional Information Technologist is lacking skills in this area - and thus I.T. faces a crisis. For the first time since the introduction of computer technology to the world of business, I.T. funding has been reduced, and investments going into computer business technology are declining. I.T. is no longer seen as the savior of a company's bottom line. This state of affairs actually represents a new opportunity for I.T. If we make a conscious decision to conduct business differently, upgrade our skills, and focus on the customer - we can get the credit, attention, and recognition we deserve. Computer technology solutions are but one part of what we offer. In the twenty-first century, we need to play a much broader role ... build stronger relationships with the people we serve ... and become an irreplaceable part of the client's business solution. Addressing the problems facing today's I.T. professional and offering corrective strategies are the sole purposes of this book. Once we do this, we will not only succeed, we will thrive. I.T. IN CRISIS: A NEW BUSINESS MODEL shows how to make this transition.
MUSIC 2013 will be the most comprehensive text focused on the various aspects of mobile, ubiquitous and intelligent computing. MUSIC 2013 provides an opportunity for academics and industry professionals to discuss the latest issues and progress in the area of intelligent technologies in mobile and ubiquitous computing environments. MUSIC 2013 follows the 3rd International Conference on Mobile, Ubiquitous, and Intelligent Computing (MUSIC-12, Vancouver, Canada, 2012), which was itself the latest event in a series of highly successful International Workshops on Multimedia, Communication and Convergence Technologies: MCC-11 (Crete, Greece, June 2011) and MCC-10 (Cebu, Philippines, August 2010).
What does it mean to live and work inside the information and communication technology revolution? The nature and significance of newly emerging patterns of social and technical interaction, as digital technologies become more pervasive in the knowledge economy, are the focus of this book. The places and spaces where digital technologies are in use are examined to show why such use may or may not be associated with improvements in society. Studies of on- and off-line interactions between individuals, and of collective attempts to govern and manage the new technologies, show that the communication revolution is essentially about people, social organization, adaptation, and control, not just technologies. This book contains original empirical studies conducted within a programme of research in the Information, Networks and Knowledge (INK) research centre at SPRU, University of Sussex.
These proceedings contain the papers selected for presentation at the 23rd International Information Security Conference (SEC 2008), co-located with the IFIP World Computer Congress (WCC 2008), September 8-10, 2008 in Milan, Italy. In response to the call for papers, 143 papers were submitted to the conference. All papers were evaluated on the basis of their significance, novelty, and technical quality, and reviewed by at least three members of the program committee. Reviewing was blind, meaning that the authors were not told which committee members reviewed which papers. The program committee meeting was held electronically, holding extensive discussion over a period of three weeks. Of the papers submitted, 42 full papers and 11 short papers were selected for presentation at the conference. A conference like this just does not happen; it depends on the volunteer efforts of a host of individuals. There is a long list of people who volunteered their time and energy to put together the conference and who deserve acknowledgment. We thank all members of the program committee and the external reviewers for their hard work in the paper evaluation. Due to the large number of submissions, program committee members were required to complete their reviews in a short time frame. We are especially thankful to them for the commitment they showed with their active participation in the electronic discussion.
This book introduces readers to genetic algorithms (GAs) with an emphasis on making the concepts, algorithms, and applications discussed as easy to understand as possible. Further, it avoids a great deal of formalisms and thus opens the subject to a broader audience in comparison to manuscripts overloaded by notations and equations. The book is divided into three parts, the first of which provides an introduction to GAs, starting with basic concepts like evolutionary operators and continuing with an overview of strategies for tuning and controlling parameters. In turn, the second part focuses on solution space variants like multimodal, constrained, and multi-objective solution spaces. Lastly, the third part briefly introduces theoretical tools for GAs, the intersections and hybridizations with machine learning, and highlights selected promising applications.
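The basic evolutionary operators introduced in the first part (selection, crossover, mutation) can be sketched on the classic OneMax toy problem, which maximises the number of 1-bits in a string (an illustrative example, not code from the book; all parameter values are made up):

```python
import random

# Minimal genetic algorithm for OneMax: maximise the number of 1-bits.
random.seed(1)
LENGTH, POP, GENS, MUT = 20, 30, 60, 0.02

def fitness(ind):
    return sum(ind)                        # number of 1-bits

def tournament(pop):
    """Selection: keep the fitter of two randomly chosen individuals."""
    return max(random.sample(pop, 2), key=fitness)

def crossover(a, b):
    """One-point crossover of two parent bitstrings."""
    cut = random.randrange(1, LENGTH)
    return a[:cut] + b[cut:]

def mutate(ind):
    """Flip each bit independently with probability MUT."""
    return [1 - g if random.random() < MUT else g for g in ind]

pop = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP)]
for _ in range(GENS):                      # one generation per iteration
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP)]
best = max(pop, key=fitness)
print(fitness(best))                       # best score found (optimum: 20)
```

The three parameters (population size, generations, mutation rate) are exactly the tuning knobs the book's first part discusses; changing any of them shifts the balance between exploration and exploitation.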
From the reviews of the previous editions: "... The book is a first class textbook and seems to be indispensable for everybody who has to teach combinatorial optimization. It is very helpful for students, teachers, and researchers in this area. The author finds a striking synthesis of nice and interesting mathematical results and practical applications. ... the author pays much attention to the inclusion of well-chosen exercises. The reader does not remain helpless; solutions or at least hints are given in the appendix. Except for some small basic mathematical and algorithmic knowledge the book is self-contained. ..." K. Engel, Mathematical Reviews 2002. "The substantial development effort of this text, involving multiple editions and trialling in the context of various workshops, university courses and seminar series, clearly shows through in this new edition with its clear writing, good organisation, comprehensive coverage of essential theory, and well-chosen applications. The proofs of important results and the representation of key algorithms in a Pascal-like notation allow this book to be used in a high-level undergraduate or low-level graduate course on graph theory, combinatorial optimization or computer science algorithms. The well-worked solutions to exercises are a real bonus for self study by students. The book is highly recommended." P.B. Gibbons, Zentralblatt für Mathematik 2005. Once again, the new edition has been thoroughly revised. In particular, some further material has been added: more on NP-completeness (especially on dominating sets), a section on the Gallai-Edmonds structure theory for matchings, and about a dozen additional exercises - as always, with solutions. Moreover, the section on the 1-factor theorem has been completely rewritten: it now presents a short direct proof for the more general Berge-Tutte formula. Several recent research developments are discussed and quite a few references have been added.
It is widely acknowledged that a common knowledge base for European research is necessary, and research repositories are an important innovation in the scientific information infrastructure. In 2006, digital repositories in the 27 countries of the European Union were surveyed, covering 114 repositories from 17 European countries. As a follow-up, this book presents the results of the 2008 survey. It shows an increasing number of respondents, but also a further diversification in the character of repositories, which may be institutional or thematically based, and as such non-institutional as well. 178 institutional research repositories and 14 thematic and other non-institutional repositories from 22 European countries took part actively. European practices should be harmonized and the development of state-of-the-art technology facilitated; authors, institutes and information users are all stakeholders in this process. In presenting a state of the art of developments, this book is a valuable guide for them in developing their policy on research repositories without losing contact with others. The ongoing spread and diversification of digital repositories makes a coherent approach urgent, as a basic feature of repositories is the retrievability of information that may be dispersed over many of them. Continued monitoring of developments will be necessary.
This book presents a comprehensive review of key distributed graph algorithms for computer network applications, with a particular emphasis on practical implementation. Topics and features: introduces a range of fundamental graph algorithms, covering spanning trees, graph traversal algorithms, routing algorithms, and self-stabilization; reviews graph-theoretical distributed approximation algorithms with applications in ad hoc wireless networks; describes in detail the implementation of each algorithm, with extensive use of supporting examples, and discusses their concrete network applications; examines key graph-theoretical algorithm concepts, such as dominating sets, and parameters for mobility and energy levels of nodes in wireless ad hoc networks, and provides a contemporary survey of each topic; presents a simple simulator, developed to run distributed algorithms; provides practical exercises at the end of each chapter.
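One of the simplest distributed spanning-tree constructions of the kind surveyed here is flooding from an initiator: every node adopts the sender of its first message as its parent and forwards the message onward. A message-passing sketch (an illustrative simulation, not code from the book; the example graph is made up):

```python
from collections import deque

# Flooding-based spanning tree rooted at an initiator node.
# Each node adopts the sender of the FIRST message it receives as its
# parent, then forwards the message to its other neighbours.
graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1, 4], 4: [2, 3]}

def flood_spanning_tree(graph, root):
    parent = {root: None}
    # In-flight messages, modelled as (sender, receiver) pairs.
    queue = deque((root, nbr) for nbr in graph[root])
    while queue:
        sender, receiver = queue.popleft()
        if receiver in parent:          # already joined the tree: ignore
            continue
        parent[receiver] = sender       # first message wins
        for nbr in graph[receiver]:
            if nbr != sender:           # forward to the other neighbours
                queue.append((receiver, nbr))
    return parent

tree = flood_spanning_tree(graph, 0)
print(tree)   # each node reachable from node 0 gets exactly one parent
```

Because the FIFO queue delivers messages in breadth-first order, this particular simulation yields a BFS tree; with arbitrary message delays, any spanning tree of the graph can result, which is exactly the nondeterminism real distributed algorithms must cope with.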
Identity-Based Encryption (IBE) is a type of public key encryption that has been intensely researched in the past decade. Identity-Based Encryption summarizes the available research on IBE and the main ideas that would enable users to pursue further work in this area. The book also covers a brief background on elliptic curves and pairings, security against chosen-ciphertext attacks, standards, and more. Advanced-level students in computer science and mathematics who specialize in cryptology, and the general community of researchers in the area of cryptology and data security, will find Identity-Based Encryption a useful book. Practitioners and engineers who work with real-world IBE schemes and need a proper understanding of the basic IBE techniques will also find this book a valuable asset.
Soft computing, as an engineering science, and statistics, as a classical branch of mathematics, emphasize different aspects of data analysis.
To solve performance problems in modern computing infrastructures, often comprising thousands of servers running hundreds of applications, spanning multiple tiers, you need tools that go beyond mere reporting. You need tools that enable performance analysis of application workflow across the entire enterprise. That's what PDQ (Pretty Damn Quick) provides. PDQ is an open-source performance analyzer based on the paradigm of queues. Queues are ubiquitous in every computing environment as buffers, and since any application architecture can be represented as a circuit of queueing delays, PDQ is a natural fit for analyzing system performance. Building on the success of the first edition, this considerably expanded second edition now comprises four parts. Part I contains the foundational concepts, as well as a new first chapter that explains the central role of queues in successful performance analysis. Part II provides the basics of queueing theory in a highly intelligible style for the non-mathematician, requiring little more than high-school algebra. Part III presents many practical examples of how PDQ can be applied. The PDQ manual has been relegated to an appendix in Part IV, along with solutions to the exercises contained in each chapter. Throughout, the Perl code listings have been newly formatted to improve readability. The PDQ code and updates to the PDQ manual are available from the author's web site at www.perfdynamics.com
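The algebra behind a single queue, the building block that such queueing analyzers compose into circuits, really does stay at high-school level. A sketch of the classic open M/M/1 formulas (plain textbook queueing theory for illustration; this is not the PDQ API, and the workload numbers are made up):

```python
# Classic open M/M/1 queue: utilisation, mean residence time, and mean
# number in system (via Little's law). Illustrative only -- not PDQ code.
def mm1_metrics(arrival_rate, service_rate):
    assert arrival_rate < service_rate, "queue must be stable"
    rho = arrival_rate / service_rate                # server utilisation
    residence = 1.0 / (service_rate - arrival_rate)  # mean time in system
    length = arrival_rate * residence                # Little's law: N = X * R
    return rho, residence, length

# e.g. 8 requests/s arriving at a server that can complete 10 requests/s:
rho, r, n = mm1_metrics(8.0, 10.0)
print(rho, r, n)   # 0.8 utilisation, 0.5 s residence time, 4 in system
```

Note how the residence time blows up as the arrival rate approaches the service rate: that hockey-stick curve is the central intuition of queue-based performance analysis.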
The world of the 21st century is, more than ever, global and impersonal. Criminal and terrorist threats, both physical and on the Internet, increase by the day. The demand for better methods of identification and access control is growing, not only in companies and organisations but also in the world at large. At the same time, such security measures have to be balanced with means for protecting the privacy of users. Identity management is put under pressure by the growing number of fraudsters who want to hide their true identity. This challenges the information security research community to focus on interdisciplinary and holistic approaches while retaining the benefits of previous research efforts. In this context, the IFIP Working Group 11.6 on Identity Management was founded in August 2006. The intention of the Working Group is to offer a broad forum for the exchange of knowledge and for the tracking and discussion of issues and new developments. In this, we take an interdisciplinary approach. Scientists as well as practitioners, from government and business, who are involved in the field of identity management are welcome to participate. The IDMAN 2007 Conference on Policies and Research in Identity Management was the very first conference organized by this Working Group. We aim to organize conferences bi-annually. The IDMAN 2007 Conference was centered around the theme of national identity management or, in other words, identity management in the public sector.
As the business paradigm shifts from a desktop-centric environment to a data-centric mobile environment, mobile services provide numerous new business opportunities, and in some cases, challenge some of the basic premises of existing business models. Strategy, Adoption, and Competitive Advantage of Mobile Services in the Global Economy seeks to foster a scientific understanding of mobile services, provide a timely publication of current research efforts, and forecast future trends in the mobile services industry. This book is an ideal resource for academics, researchers, government policymakers, as well as corporate managers looking to enhance their competitive edge in or understanding of mobile services.
Data management is the process of planning, coordinating and controlling data resources. Increasingly, applications need to store and search large amounts of data. Data management has been continuously challenged by demands from various areas and applications and has evolved in parallel with advances in hardware and computing techniques. This volume focuses on these recent advances and is composed of five parts and a total of eighteen chapters. The first part of the book contains five contributions in the area of information retrieval and Web intelligence: a novel approach to solving the index selection problem, integrated retrieval from the Web of documents and data, bipolarity in database querying, deriving data summarization through ontologies, and granular computing for Web intelligence. The second part of the book contains four contributions in the knowledge discovery area. Its third part contains three contributions in the information integration and data security area. The remaining two parts of the book contain six contributions in the area of intelligent agents and applications of data management in the medical domain.