Managing Complexity is the first book that clearly defines the concept of complexity, explains how complexity can be measured and tuned, and describes the seven key features of complex systems: (1) connectivity, (2) autonomy, (3) emergence, (4) nonequilibrium, (5) non-linearity, (6) self-organisation and (7) co-evolution. The thesis of the book is that the complexity of the environment in which we work and live offers new opportunities, and that the best strategy for surviving and prospering under conditions of complexity is to develop adaptability to perpetually changing conditions. An effective method for designing adaptability into business processes using multi-agent technology is presented and illustrated by several extensive examples, including adaptive, real-time scheduling of taxis, sea-going tankers, road transport, supply chains, railway trains, production processes and swarms of small space satellites. Additional case studies include adaptive servicing of the International Space Station; adaptive processing of design changes of large structures such as the wings of the largest airliner in the world; and dynamic data mining, knowledge discovery and distributed semantic processing. Finally, the book provides a foretaste of the next generation of complex issues, notably the Internet of Things, Smart Cities, Digital Enterprises and Smart Logistics.
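To make the multi-agent scheduling idea concrete, here is a minimal Python sketch, emphatically not the book's method, of adaptive dispatch in the style of the taxi example: each incoming order is auctioned among taxi agents, the lowest bid wins, and the schedule adapts event by event rather than being planned up front. All names and coordinates are invented for illustration.

```python
# A minimal sketch (not the book's method) of agent-style adaptive dispatch:
# each incoming order is won by the taxi whose bid (distance to the pickup)
# is lowest, so the schedule adapts order by order instead of being planned
# in advance. Taxi IDs and positions are invented for the example.
taxis = {"T1": (0.0, 0.0), "T2": (5.0, 5.0), "T3": (9.0, 1.0)}  # current positions

def bid(taxi_pos, pickup):
    """A taxi agent's bid: its straight-line distance to the pickup point."""
    return ((taxi_pos[0] - pickup[0]) ** 2 + (taxi_pos[1] - pickup[1]) ** 2) ** 0.5

def dispatch(pickup, dropoff):
    """Auction the order; the winning agent relocates to the drop-off."""
    winner = min(taxis, key=lambda t: bid(taxis[t], pickup))
    taxis[winner] = dropoff
    return winner

# Orders arrive in real time; each is assigned to the currently best agent.
for order in [((1.0, 1.0), (8.0, 2.0)), ((6.0, 6.0), (0.0, 5.0))]:
    print(dispatch(*order))
```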
"Discrete-Time Linear Systems: Theory and Design with Applications "combines system theory and design in order to show the importance of system theory and its role in system design. The book focuses on system theory (including optimal state feedback and optimal state estimation) and system design (with applications to feedback control systems and wireless transceivers, plus system identification and channel estimation).
This book presents a systematic methodology for the development of parallel multi-physics models and its implementation in geophysical and biomedical applications. The methodology includes conservative discretization methods for partial differential equations on general meshes, as well as data structures and algorithms for organizing parallel simulations on general meshes. The structures and algorithms form the core of the INMOST (Integrated Numerical Modelling Object-oriented Supercomputing Technologies) platform for the development of parallel models on general meshes. The authors consider applications for addressing specific geophysical and biomedical challenges, including radioactive contaminant propagation with subsurface waters, reservoir simulation, and clot formation in blood flows. The book gathers all the components of this methodology, from algorithms and numerical methods to the open-source software, as well as examples of practical applications, in a single source, making it a valuable asset for applied mathematicians, computer scientists, and engineers alike.
Developments in industry in recent years have made employee learning a critical factor in organizations' success. The ever-faster pace of technological development and the variety of tasks that business professionals must perform mean that on-the-job learning is a constant, too quick and too vital to be left to training departments. And yet management knows too little about how workers learn on the job and does not devote sufficient time and effort to understanding this process. As learning is largely left to chance, it is remarkable that it happens at all, and well enough for workers to be productive without undoing each other's work. This book explores the daily work lives and learning experiences of programmers and other professionals in the computer-software industry. The book focuses on the staff of one small software firm, allowing workers to tell their own stories, describing their work and their use of all the resources available to them in learning the complex systems they are required to develop and maintain. Grounded in qualitative sociological methods, it is an ethnography of a business setting as well as a study of learning. After describing the professional world in which programmers work, the book introduces the company under discussion and the backgrounds of the participants in the study. Then, proceeding from the environment to the systems to be learned, the author schematizes all of the resources professionals use on the job (their experiences and thought processes, documentation, their colleagues, the computer, and the software system itself) as learning tools. All of this material is then related to academic models of learning style, which are mostly found not to be very relevant, as they are not grounded in the life experiences of workers. The author advocates that professionals' learning be modeled in context, that training be developed from experience rather than from theory, and that management strive to build a workplace and an organizational culture as conducive as possible to employees' continual learning.
The subject of partial differential equations (PDEs), which first emerged in the 18th century, holds an exciting and special position in applications relating to the mathematical modelling of physical phenomena. The theory of PDEs has been developed by major names in applied mathematics such as Euler, Legendre, Laplace and Fourier, and has applications to virtually every physical phenomenon known to us, e.g. fluid flow, elasticity, electricity and magnetism, weather forecasting and financial modelling. This book introduces recent developments of PDEs in the field of geometric design, particularly for computer-based design and analysis involving the geometry of physical objects. Starting from the basic theory and proceeding to the discussion of practical applications, the book describes how PDEs can be used in the areas of Computer Aided Design and Simulation Based Design. Extensive examples with real-life applications of PDEs in the area of geometric design are discussed in the book.
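For a flavour of what PDE-based geometric design involves, a representative formulation (widely used in this line of work, though the book's own treatment may differ) is the fourth-order elliptic PDE of the Bloor-Wilson method, which generates a surface patch X(u,v) from its boundary curves:

```latex
\left( \frac{\partial^{2}}{\partial u^{2}} + a^{2}\,\frac{\partial^{2}}{\partial v^{2}} \right)^{2} \mathbf{X}(u,v) = \mathbf{0}
```

Here (u,v) parametrise the patch, the boundary conditions carry the shape information, and the parameter a weights the smoothing between the two parametric directions.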
Soft computing, as an engineering science, and statistics, as a classical branch of mathematics, emphasize different aspects of data analysis.
As the business paradigm shifts from a desktop-centric environment to a data-centric mobile environment, mobile services provide numerous new business opportunities and, in some cases, challenge some of the basic premises of existing business models. Strategy, Adoption, and Competitive Advantage of Mobile Services in the Global Economy seeks to foster a scientific understanding of mobile services, provide a timely publication of current research efforts, and forecast future trends in the mobile services industry. This book is an ideal resource for academics, researchers and government policymakers, as well as corporate managers looking to enhance their competitive edge in, or understanding of, mobile services.
These proceedings contain the papers selected for presentation at the 23rd International Information Security Conference (SEC 2008), co-located with the IFIP World Computer Congress (WCC 2008), September 8-10, 2008 in Milan, Italy. In response to the call for papers, 143 papers were submitted to the conference. All papers were evaluated on the basis of their significance, novelty, and technical quality, and reviewed by at least three members of the program committee. Reviewing was blind, meaning that the authors were not told which committee members reviewed which papers. The program committee meeting was held electronically, with intensive discussion over a period of three weeks. Of the papers submitted, 42 full papers and 11 short papers were selected for presentation at the conference. A conference like this just does not happen; it depends on the volunteer efforts of a host of individuals. There is a long list of people who volunteered their time and energy to put together the conference and who deserve acknowledgment. We thank all members of the program committee and the external reviewers for their hard work in the paper evaluation. Due to the large number of submissions, program committee members were required to complete their reviews in a short time frame. We are especially thankful to them for the commitment they showed with their active participation in the electronic discussion.
Reaction-diffusion and excitable media are amongst the most intriguing substrates. Despite the apparent simplicity of the physical processes involved, these media exhibit a wide range of amazing patterns: from target and spiral waves to travelling localisations and stationary breathing patterns. These media are at the heart of most natural processes, including the morphogenesis of living beings, geological formations, nervous and muscular activity, and socio-economic developments. This book explores a minimalist paradigm for studying reaction-diffusion and excitable media using locally-connected networks of finite-state machines: cellular automata and automata on proximity graphs. Cellular automata are marvellous objects per se because they show us how to generate and manage complexity using very simple rules of dynamical transitions. When combined with the reaction-diffusion paradigm, cellular automata become an essential user-friendly tool for modelling natural systems and designing future and emergent computing architectures. The book brings together hot topics of non-linear sciences, complexity, and future and emergent computing. It shows how to discover propagating localisations and perform computation with them in very simple two-dimensional automaton models. The paradigms, models and implementations presented in the book strengthen the theoretical foundations of future and emergent computing and lay key stones towards physically embodied information-processing systems.
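A minimal example of the kind of automaton the book builds on is the classic Greenberg-Hastings excitable-medium cellular automaton (a standard textbook model, not code from the book), sketched here in Python: three cell states and one neighbourhood rule suffice to produce propagating target waves.

```python
# Greenberg-Hastings excitable medium: each cell is resting (0), excited (1)
# or refractory (2); a resting cell fires when at least one of its four
# von Neumann neighbours is excited. A single seed produces a target wave.
import numpy as np

def step(grid):
    excited_neighbour = np.zeros_like(grid, dtype=bool)
    for shift, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)]:
        excited_neighbour |= np.roll(grid, shift, axis=axis) == 1
    new = np.zeros_like(grid)
    new[(grid == 0) & excited_neighbour] = 1   # resting cell becomes excited
    new[grid == 1] = 2                         # excited cell becomes refractory
    new[grid == 2] = 0                         # refractory cell recovers
    return new

# Seed a single excited cell and watch the wave propagate outwards.
grid = np.zeros((64, 64), dtype=int)
grid[32, 32] = 1
for _ in range(20):
    grid = step(grid)
```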
The focus of these conference proceedings is on research, development, and applications in the fields of numerical geometry, scientific computing and numerical simulation, particularly in mesh generation and related problems. In addition, this year's special focus is on Voronoi diagrams and their applications, celebrating the 150th birthday of G.F. Voronoi. In terms of content, the book strikes a balance between engineering algorithms and mathematical foundations. It presents an overview of recent advances in numerical geometry, grid generation and adaptation in terms of mathematical foundations, algorithm and software development and applications. The specific topics covered include: quasi-conformal and quasi-isometric mappings, hyperelastic deformations, multidimensional generalisations of the equidistribution principle, discrete differential geometry, spatial and metric encodings, Voronoi-Delaunay theory for tilings and partitions, duality in mathematical programming and numerical geometry, mesh-based optimisation and optimal control methods. Further aspects examined include iterative solvers for variational problems and algorithm and software development. The applications of the methods discussed are multidisciplinary and include problems from mathematics, physics, biology, chemistry, material science, and engineering.
This book focuses on recent research in modern optimization and its implications in control and data analysis. It is a collection of papers from the conference "Optimization and Its Applications in Control and Data Science", dedicated to Professor Boris T. Polyak, which was held in Moscow, Russia on May 13-15, 2015. The book reflects developments in theory and applications rooted in Professor Polyak's fundamental contributions to constrained and unconstrained optimization, differentiable and nonsmooth functions, control theory and approximation. Each paper focuses on techniques for solving complex optimization problems in different application areas or on recent developments in optimization theory and methods. Open problems in optimization, game theory and control theory are included in this collection, which will interest engineers and researchers working with efficient algorithms and software for solving optimization problems in market and data analysis. Theoreticians in operations research, applied mathematics, algorithm design, artificial intelligence, machine learning, and software engineering will find this book useful, and graduate students will find the state-of-the-art research valuable.
To solve performance problems in modern computing infrastructures, often comprising thousands of servers running hundreds of applications spanning multiple tiers, you need tools that go beyond mere reporting. You need tools that enable performance analysis of application workflow across the entire enterprise. That's what PDQ (Pretty Damn Quick) provides. PDQ is an open-source performance analyzer based on the paradigm of queues. Queues are ubiquitous in every computing environment as buffers, and since any application architecture can be represented as a circuit of queueing delays, PDQ is a natural fit for analyzing system performance. Building on the success of the first edition, this considerably expanded second edition now comprises four parts. Part I contains the foundational concepts, as well as a new first chapter that explains the central role of queues in successful performance analysis. Part II provides the basics of queueing theory in a highly intelligible style for the non-mathematician; little more than high-school algebra is required. Part III presents many practical examples of how PDQ can be applied. The PDQ manual has been relegated to an appendix in Part IV, along with solutions to the exercises contained in each chapter. Throughout, the Perl code listings have been newly formatted to improve readability. The PDQ code and updates to the PDQ manual are available from the author's web site at www.perfdynamics.com.
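For readers unfamiliar with the queueing paradigm, here is a back-of-envelope Python sketch of the single-queue (M/M/1) relationships that a tool like PDQ composes into whole-system models; this is plain Python, not the PDQ API.

```python
# Back-of-envelope M/M/1 queue metrics (plain Python, not the PDQ API).
# For arrival rate lam and mean service time S:
#   utilisation   rho = lam * S
#   residence     R   = S / (1 - rho)    (service time plus waiting time)
#   queue length  Q   = lam * R          (Little's law)
def mm1_metrics(lam, S):
    rho = lam * S
    if rho >= 1.0:
        raise ValueError("queue is unstable: utilisation must be < 1")
    R = S / (1.0 - rho)
    return {"utilisation": rho, "residence_time": R, "queue_length": lam * R}

# Example: requests arriving at 80/s against a 10 ms service demand
# give rho = 0.8, R = 50 ms and about 4 requests in the system.
print(mm1_metrics(lam=80.0, S=0.010))
```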
Towards Solid-State Quantum Repeaters: Ultrafast, Coherent Optical Control and Spin-Photon Entanglement in Charged InAs Quantum Dots summarizes several state-of-the-art coherent spin-manipulation experiments in III-V quantum dots. High-fidelity optical manipulation, decoherence due to nuclear spins, and the extraction of spin coherence are all discussed, as is the generation of entanglement between a single spin qubit and a photonic qubit. The experimental results are analyzed and discussed in the context of future quantum technologies, such as quantum repeaters. Single spins in optically active semiconductor host materials have emerged as leading candidates for quantum information processing (QIP). The quantum nature of the spin allows for the encoding of stationary, memory quantum bits (qubits), and the relatively weak interaction with the host material preserves spin coherence. On the other hand, optically active host materials permit direct interfacing with light, which can be used for all-optical qubit manipulation and for efficiently mapping matter qubits into photonic qubits suited for long-distance quantum communication.
CSIE 2011 is an international scientific congress for distinguished scholars engaged in scientific, engineering and technological research, dedicated to building a platform for exploring and discussing the future of Computer Science and Information Engineering with existing and potential application scenarios. The congress has been held twice, first in Los Angeles, USA and then in Changchun, China, each edition attracting a large number of researchers from all over the world. The congress has developed a spirit of cooperation that leads to new friendships for addressing a wide variety of ongoing problems in this vibrant area of technology and fosters collaboration across the world. The congress, CSIE 2011, received 2483 full-paper and abstract submissions from 27 countries and regions around the world. Through a rigorous peer-review process, all submissions were refereed based on their quality of content, level of innovation, significance, originality and legibility. Ultimately, 688 papers were accepted for the international congress proceedings.
Weighted finite automata are classical nondeterministic finite automata in which the transitions carry weights. These weights may model, for example, the cost involved when executing a transition, the resources or time needed for this, or the probability or reliability of its successful execution. Weights can also be added to classical automata with infinite state sets like pushdown automata, and this extension constitutes the general concept of weighted automata. Since their introduction in the 1960s they have stimulated research in related areas of theoretical computer science, including formal language theory, algebra, logic, and discrete structures. Moreover, weighted automata and weighted context-free grammars have found application in natural-language processing, speech recognition, and digital image compression. This book covers all the main aspects of weighted automata and formal power series methods, ranging from theory to applications. The contributors are the leading experts in their respective areas, and each chapter presents a detailed survey of the state of the art and pointers to future research. The chapters in Part I cover the foundations of the theory of weighted automata, specifically addressing semirings, power series, and fixed point theory. Part II investigates different concepts of weighted recognizability. Part III examines alternative types of weighted automata and various discrete structures other than words. Finally, Part IV deals with applications of weighted automata, including digital image compression, fuzzy languages, model checking, and natural-language processing. Computer scientists and mathematicians will find this book an excellent survey and reference volume, and it will also be a valuable resource for students exploring this exciting research area.
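To see the core idea in miniature, here is a Python sketch (not from the book) of how a weighted automaton assigns a weight to a word: weights multiply along a run and add across runs, the two operations of a semiring. The toy two-state automaton and its weights are invented for illustration.

```python
# A toy weighted automaton over the probability semiring (+, *).
# transitions[(state, symbol)] -> list of (next_state, weight)
from collections import defaultdict

transitions = {
    (0, 'a'): [(0, 0.5), (1, 0.5)],
    (1, 'a'): [(1, 1.0)],
    (1, 'b'): [(0, 0.3), (1, 0.7)],
}
initial = {0: 1.0}   # initial weight of each state
final = {1: 1.0}     # final (acceptance) weight of each state

def word_weight(word):
    """Sum over all runs of the product of transition weights."""
    current = dict(initial)
    for symbol in word:
        nxt = defaultdict(float)
        for state, w in current.items():
            for succ, tw in transitions.get((state, symbol), []):
                nxt[succ] += w * tw   # semiring: * along a run, + across runs
        current = nxt
    return sum(w * final.get(state, 0.0) for state, w in current.items())

print(word_weight("aab"))   # 0.525 for this toy automaton
```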
Intelligent information and database systems are two closely related and well-established subfields of modern computer science. They focus on the integration of artificial intelligence and classic database technologies in order to create the class of next-generation information systems. The major target of this new generation of systems is to provide end-users with intelligent behavior: simple and/or advanced learning, problem solving, uncertain and certain reasoning, self-organization, cooperation, etc. Such intelligent abilities are implemented in classic information systems to make them autonomous and user oriented, in particular when advanced problems of multimedia information and knowledge discovery, access, retrieval and manipulation are to be solved in the context of large, distributed and heterogeneous environments. It means that intelligent knowledge-based information and database systems are used to solve basic problems of large collections management, carry out knowledge discovery from large data collections, reason about information under uncertain conditions, support users in their formulation of complex queries, etc. Topics discussed in this volume include but are not limited to the foundations and principles of data, information, and knowledge models, and methodologies for intelligent information and database systems analysis, design, implementation, validation, maintenance and evolution.
This book is a comprehensive, unifying introduction to the field of mathematical analysis and the mathematics of computing. It develops the relevant theory at a modern level and directly relates modern mathematical ideas to their diverse applications. Starting with a simple axiom system for the real numbers, the authors lay the foundations and then develop the whole theory, showing where it applies and in turn motivating its further development. They progress from sets, structures, and numbers to metric spaces, continuous functions in metric spaces, linear normed spaces and linear mappings; and then to differential calculus and its applications, the integral calculus, the gamma function, and linear integral operators. They then present important aspects of approximation theory, including numerical integration. The remaining parts of the book are devoted to ordinary differential equations, the discretization of operator equations, and numerical solutions of ordinary differential equations. This textbook contains many exercises of varying degrees of difficulty, suitable for self-study, and at the end of each chapter the authors present more advanced problems that shed light on interesting features, suitable for classroom seminars or study groups. It will be valuable for undergraduate and graduate students in mathematics, computer science, and related fields such as engineering. This is a rich field that has experienced enormous development in recent decades, and the book will also serve as a reference for graduate students and practitioners who require a deeper understanding of the methodologies, techniques, and foundations.
The book covers various topics in computer algebra methods, algorithms and software applied to scientific computing. An important topic presented in the book, which may be of interest to researchers and engineers, is the application of computer algebra methods to the development of new, efficient analytic and numerical solvers for both ordinary and partial differential equations. A specific feature of the book is its extensive use of advanced software systems such as Mathematica and Maple for the solution of such problems and for the industrial application of computer algebra in simulation. The book will be useful for researchers and engineers who apply advanced computer algebra methods to the solution of their problems.
This book presents advances in alternative swarm development that have proved to be effective in several complex problems. Swarm intelligence (SI) is a problem-solving methodology that results from the cooperation between a set of agents with similar characteristics. The study of biological entities, such as animals and insects, manifesting social behavior has resulted in several computational models of swarm intelligence. While there are numerous books addressing the most widely known swarm methods, namely ant colony algorithms and particle swarm optimization, those discussing new alternative approaches are rare. The focus on developments based on the simple modification of popular swarm methods overlooks the opportunity to discover new techniques and procedures that can be useful in solving problems formulated by the academic and industrial communities. Presenting various novel swarm methods and their practical applications, the book helps researchers, lecturers, engineers and practitioners solve their own optimization problems.
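As a reference point for the classical methods the book moves beyond, here is a minimal Python sketch of particle swarm optimisation, the best-known instance of the agent-cooperation pattern described above; it is illustrative only, and the book's alternative methods differ from it.

```python
# Classic particle swarm optimisation (illustrative baseline, not from the
# book): each agent is pulled towards its own best position (cognitive term)
# and the swarm's best position (social term).
import random

def pso(f, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    best = [p[:] for p in pos]          # each agent's best-seen position
    gbest = min(best, key=f)[:]         # swarm-wide best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (best[i][d] - pos[i][d])   # cognitive pull
                             + c2 * r2 * (gbest[d] - pos[i][d]))    # social pull
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(best[i]):
                best[i] = pos[i][:]
                if f(best[i]) < f(gbest):
                    gbest = best[i][:]
    return gbest

# Example: minimise the sphere function; the swarm converges near the origin.
print(pso(lambda x: sum(v * v for v in x), dim=2))
```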
This book grew out of the Fourth Conference on Computers and the Writing Process, held at the University of Sussex in March 1991. Fifteen refereed papers were selected from the conference, and the authors were asked to develop them into chapters appropriate for this book, incorporating insights gained from their conference presentations. The book covers all aspects of computers and the writing process, including computer-based collaborative writing, hypertext, computers and writing education, computers and professional authors, evaluation of computer-based writing, computers and technical writing, and computer-supported fiction. The resulting collection provides an up-to-date cross-section of this increasingly important interdisciplinary topic, with computing, cognitive and educational perspectives covered. The book will be of interest to workers and researchers in language, cognition and computer science, especially those interested in hypermedia, human-computer interaction and cooperative technologies.
This book introduces the concept of policy decision emergence and its dynamics at the sub-systemic level of the decision process. This level constitutes the breeding ground for the emergence of policy decisions but remains unexplored due to the absence of adequate tools. It is a nonlinear complex system made of several entities that interact dynamically, and the behavior of such a system cannot be understood with linear and deterministic methods. The book presents an innovative multidisciplinary approach that results in the development of a Policy Decision Emergence Simulation Model (PODESIM). This computational model is a multi-level fuzzy inference system that allows the identification of decision-emergence levers. This development represents a major advancement in the field of public policy decision studies. It paves the way for decision-emergence modeling and simulation by bridging complex systems theory, multiple streams theory, and fuzzy logic theory.
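To give a sense of the fuzzy-inference building block that a multi-level model such as PODESIM stacks together, here is a minimal Python sketch; the variable names, membership functions and rules are hypothetical, not taken from the book.

```python
# A single fuzzy-inference level (hypothetical example, not PODESIM itself):
# fuzzify crisp inputs, fire IF-THEN rules, defuzzify with a weighted average.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def decision_emergence(support, urgency):
    """Two inputs in [0, 1] mapped to a crisp 'emergence likelihood' in [0, 1]."""
    lo_s, hi_s = tri(support, -0.5, 0.0, 0.6), tri(support, 0.4, 1.0, 1.5)
    lo_u, hi_u = tri(urgency, -0.5, 0.0, 0.6), tri(urgency, 0.4, 1.0, 1.5)
    # Rule firing strengths (min for AND), each paired with an output level.
    rules = [
        (min(hi_s, hi_u), 0.9),   # strong support AND high urgency -> likely
        (min(hi_s, lo_u), 0.5),
        (min(lo_s, hi_u), 0.4),
        (min(lo_s, lo_u), 0.1),   # weak support AND low urgency -> unlikely
    ]
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total else 0.0

print(decision_emergence(support=0.8, urgency=0.7))
```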
Silicon technology now allows us to build chips consisting of tens of millions of transistors. This technology not only promises new levels of system integration onto a single chip, but also presents significant challenges to the chip designer. As a result, many ASIC developers and silicon vendors are re-examining their design methodologies, searching for ways to make effective use of the huge numbers of gates now available. These designers see current design tools and methodologies as inadequate for developing million-gate ASICs from scratch. There is considerable pressure to keep design team size and design schedules constant even as design complexities grow. Tools are not providing the productivity gains required to keep pace with the increasing gate counts available from deep submicron technology. Design reuse, the use of pre-designed and pre-verified cores, is the most promising opportunity to bridge the gap between available gate count and designer productivity. Reuse Methodology Manual for System-On-A-Chip Designs, Second Edition outlines an effective methodology for creating reusable designs for use in a System-on-a-Chip (SoC) design methodology. Silicon and tool technologies move so quickly that no single methodology can provide a permanent solution to this highly dynamic problem. Instead, this manual is an attempt to capture and incrementally improve on current best practices in the industry, and to give a coherent, integrated view of the design process. Reuse Methodology Manual for System-On-A-Chip Designs, Second Edition will be updated on a regular basis as technology changes and insight improves into the problems of design reuse and its role in producing high-quality SoC designs.
This innovative and in-depth book integrates the well-developed theory and practical applications of one-dimensional and multidimensional multirate signal processing. Using a rigorous mathematical framework, it carefully examines the fundamentals of this rapidly growing field. Areas covered include: basic building blocks of multirate signal processing; fundamentals of multidimensional multirate signal processing; multirate filter banks; lossless lattice structures; and an introduction to wavelet signal processing.
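The two basic building blocks mentioned above are easy to state precisely; here is a minimal Python sketch (not from the book) of the M-fold decimator and the L-fold expander.

```python
# The two elementary multirate operations (illustrative sketch):
# the M-fold decimator and the L-fold expander.
import numpy as np

def decimate(x, M):
    """Keep every M-th sample: y[n] = x[Mn]."""
    return x[::M]

def expand(x, L):
    """Insert L-1 zeros between samples: y[Ln] = x[n], else 0."""
    y = np.zeros(len(x) * L, dtype=x.dtype)
    y[::L] = x
    return y

x = np.arange(8, dtype=float)
print(decimate(x, 2))      # [0. 2. 4. 6.]
print(expand(x[:4], 2))    # [0. 0. 1. 0. 2. 0. 3. 0.]
```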