Showing 1 - 9 of 9 matches in All Departments
This book is a comprehensive introduction to Organic Computing (OC), systematically presenting the current state of the art in OC. It starts with motivating examples of self-organising, self-adaptive and emergent systems, derives their common characteristics and explains the fundamental ideas for a formal characterisation of such systems. Special emphasis is given to a quantitative treatment of concepts like self-organisation, emergence, autonomy, robustness, and adaptivity. The book shows practical examples of architectures for OC systems and their applications in traffic control, grid computing, sensor networks, robotics, and smart camera systems. The extension of single OC systems into collective systems consisting of social agents based on concepts like trust and reputation is explained. OC makes heavy use of learning and optimisation technologies; a compact overview of these technologies and related approaches to self-organising systems is provided. So far, OC literature has been published with the researcher in mind. Although the existing books have tried to follow a didactic concept, they remain essentially collections of scientific papers. A comprehensive and systematic account of the OC ideas, methods, and achievements in the form of a textbook that lends itself to the newcomer in this field has been missing so far. The targeted reader of this book is the master's student in Computer Science, Computer Engineering or Electrical Engineering, or any other newcomer to the field of Organic Computing with some technical or Computer Science background. Readers can approach OC ideas from different perspectives: OC can be viewed (1) as a "philosophy" of adaptive and self-organising, life-like, technical systems, (2) as an approach to a more quantitative and formal understanding of such systems, and finally (3) as a construction method for the practitioner who wants to build such systems. In this book, we first try to convey to the reader a feeling for the special character of natural and technical self-organising and adaptive systems through a large number of illustrative examples. Then we discuss quantitative aspects of such forms of organisation, and finally we turn to methods of how to build such systems for practical applications.
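As a small illustration of the kind of quantitative treatment mentioned above, one common way to make emergence measurable is to compare the entropy of an observed system attribute before and after self-organisation sets in; a drop in entropy indicates increased order. The following Python sketch only illustrates that idea under simple assumptions (the attribute values and the two observation windows are invented for the example, not taken from the book):

    import math
    from collections import Counter

    def shannon_entropy(observations):
        """Shannon entropy (in bits) of a list of discrete attribute values."""
        counts = Counter(observations)
        total = len(observations)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    def emergence_indicator(before, after):
        """Entropy reduction between an early, unordered observation window
        and a later one; a positive value signals increased order."""
        return shannon_entropy(before) - shannon_entropy(after)

    # Headings of simulated agents: first random, later mostly aligned.
    unordered = ["N", "E", "S", "W", "N", "E", "S", "W"]
    ordered = ["N", "N", "N", "N", "N", "E", "N", "N"]
    print(f"emergence = {emergence_indicator(unordered, ordered):.2f} bits")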
This book treats the computational use of social concepts as the focal point for the realisation of a novel class of socio-technical systems, comprising smart grids, public display environments, and grid computing. These systems are composed of technical and human constituents that interact with each other in an open environment. Heterogeneity, large scale, and uncertainty in the behaviour of the constituents and the environment are the rule rather than the exception. Ensuring the trustworthiness of such systems allows their technical constituents to interact with each other in a reliable, secure, and predictable way while their human users are able to understand and control them. "Trustworthy Open Self-Organising Systems" contains a wealth of knowledge, from trustworthy self-organisation mechanisms, to trust models, methods to measure a user's trust in a system, a discussion of social concepts beyond trust, and insights into the impact open self-organising systems will have on society.
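The trust models discussed in the book go well beyond this, but as a rough sketch of how a technical constituent might combine its own interaction history with third-party reports into a reputation value, consider the following Python fragment (the fixed linear weighting and all names are illustrative assumptions, not a model taken from the book):

    def reputation(direct_experiences, witness_reports, direct_weight=0.7):
        """Combine an agent's own interaction ratings (values in [0, 1]) with
        ratings reported by witnesses into a single reputation value.
        The linear weighting is an illustrative choice, not a prescribed model."""
        def mean(values):
            return sum(values) / len(values) if values else 0.5  # neutral prior
        return (direct_weight * mean(direct_experiences)
                + (1 - direct_weight) * mean(witness_reports))

    # Direct experience counts for more than hearsay.
    print(reputation(direct_experiences=[1.0, 0.9, 0.8], witness_reports=[0.4, 0.6]))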
Microelectronics is certainly one of the key technologies of our time. It is a key factor of technological and economic progress, affecting the fields of automation, information and communication and leading to the development of new applications and markets. Attention should be focused on three areas of development: process and production technology, test technology, and design technology. Clearly, because of the development of new application fields, the skill of designing integrated circuits should not be limited to a few, highly specialized experts. Rather, this ability should be made available to all system and design engineers as a new application technology, just like programming technology for software. For this reason, design procedures have to be developed which, supported by appropriate CAD systems, provide the design engineer with tools for representation, effective instruments for design, and reliable tools for verification, ensuring simple, proper and easily controllable interfaces to the manufacturing and test processes. Such CAD systems are called standard design systems. They open the way to fast and safe design of integrated circuits. This book demonstrates these basic principles using the example of the Siemens design system VENUS: it gives a general introduction to the method of designing integrated circuits, familiarizes the reader with basic semiconductor and circuit technologies, shows the various methods of layout design, and presents the necessary concepts and strategies of test technology.
Organic Computing has emerged as a challenging vision for future information processing systems. Its basis is the insight that we will increasingly be surrounded by and depend on large collections of autonomous systems, which are equipped with sensors and actuators, aware of their environment, communicating freely, and organising themselves in order to perform the actions and services required by the users. These networks of intelligent systems surrounding us open fascinating application areas and at the same time bear the problem of their controllability. Hence, we have to construct such systems to be as robust, safe, flexible, and trustworthy as possible. In particular, a strong orientation towards human needs, as opposed to a pure implementation of the technologically possible, seems absolutely central. The technical systems which can achieve these goals will have to exhibit life-like or "organic" properties. "Organic Computing Systems" adapt dynamically to their current environmental conditions. In order to cope with unexpected or undesired events they are self-organising, self-configuring, self-optimising, self-healing, self-protecting, self-explaining, and context-aware, while offering complementary interfaces for higher-level directives with respect to the desired behaviour. First steps towards adaptive and self-organising computer systems are being undertaken. Adaptivity, reconfigurability, emergence of new properties, and self-organisation are hot topics in a variety of research groups worldwide. This book summarises the results of a six-year priority research programme (SPP) of the German Research Foundation (DFG) addressing these fundamental challenges in the design of Organic Computing systems. It presents and discusses the theoretical foundations of Organic Computing, basic methods and tools, learning techniques used in this context, architectural patterns, and many applications. The final outlook shows that in the meantime Organic Computing ideas have spawned a variety of promising new projects.
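Architectural patterns such as the observer/controller loop recur throughout this research programme: an observer aggregates sensor data about the productive system, and a controller intervenes only when the observed situation drifts away from higher-level directives. The Python sketch below only illustrates that general loop shape; the class names, the proportional correction, and the dummy system are assumptions made for the example, not code from the book:

    class ObserverControllerLoop:
        """Minimal sketch of an observer/controller loop: observe the productive
        system, compare the situation with the objective, intervene if needed."""

        def __init__(self, system, objective, tolerance=0.1):
            self.system = system        # system under observation and control
            self.objective = objective  # target set by higher-level directives
            self.tolerance = tolerance

        def observe(self):
            # Aggregate raw sensor readings into one situation descriptor.
            readings = self.system.sensor_readings()
            return sum(readings) / len(readings)

        def control(self):
            # Reconfigure only if the situation leaves the tolerated corridor.
            deviation = self.observe() - self.objective
            if abs(deviation) > self.tolerance:
                self.system.adjust(-deviation)  # simple proportional correction
            return deviation

    class DummySystem:
        """Stand-in for a productive system, used only to run the example."""
        def __init__(self):
            self.values = [0.8, 1.4, 1.3]

        def sensor_readings(self):
            return self.values

        def adjust(self, delta):
            self.values = [v + delta for v in self.values]

    loop = ObserverControllerLoop(DummySystem(), objective=1.0)
    print(loop.control())  # deviation before correction; the system is nudged back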
The ARCS series of conferences has over 30 years of tradition reporting top-notch results in computer architecture and operating systems research. It is organized by the special interest group on "Computer and System Architecture" of the GI (Gesellschaft für Informatik e.V.) and ITG (Informationstechnische Gesellschaft im VDE, the Information Technology Society). In 2010, ARCS was hosted by Leibniz University Hannover. This year's special focus was on heterogeneous systems. The conference's topics comprised design aspects of multi-cores and memory systems, adaptive system architectures such as reconfigurable systems in hardware and software, customization and application-specific accelerators in heterogeneous architectures, organic and autonomic computing, energy awareness, system aspects of ubiquitous and pervasive computing, and embedded systems. The call for papers attracted about 55 submissions from all around the world. Each submission was assigned to at least three members of the Program Committee for review. The Program Committee decided to accept 20 papers, which were arranged in seven sessions. The accepted papers are from Belgium, China, France, Germany, Italy, Spain, Turkey, and the UK. Two keynotes on heterogeneous systems complemented the strong technical program.
This book constitutes the refereed proceedings of the 22nd International Conference on Architecture of Computing Systems, ARCS 2009, held in Delft, The Netherlands, in March 2009. The 21 revised full papers presented together with 3 keynote papers were carefully reviewed and selected from 57 submissions. This year's special focus was set on energy awareness. The papers are organized in topical sections on compilation technologies, reconfigurable hardware and applications, massive parallel architectures, organic computing, memory architectures, energy awareness, Java processing, and chip-level multiprocessing.
This book constitutes the refereed proceedings of the 4th International Conference on Autonomic and Trusted Computing, ATC 2007, held in Hong Kong, China in July 2007, co-located with UIC 2007, the 4th International Conference on Ubiquitous Intelligence and Computing. The 55 revised full papers presented together with 1 keynote lecture were carefully reviewed and selected from 223 submissions. The papers are organized in topical sections on cryptography and signatures, autonomic computing and services, secure and trusted computing, autonomic models and architectures, trusted models and systems, intrusion detection, access control, trusted computing and communications, key management, worm detection and data security, secured services and applications, as well as fault-tolerant systems.
Where is system architecture heading? The special interest group on Computer and Systems Architecture (Fachausschuss Rechner- und Systemarchitektur) of the German computer and information technology associations GI and ITG asked this question and discussed it during two Future Workshops in 2002. The result in a nutshell: everything will change, but everything else will remain. Future systems technologies will build on a mature basis of silicon and IC technology, on well-understood programming languages and software engineering techniques, and on well-established operating systems and middleware concepts. Newer and still exotic but exciting technologies like quantum computing and DNA processing are to be watched closely, but they will not be mainstream in the next decade. Although there will be considerable progress in these basic technologies, is there any major trend which unifies these diverse developments? There is a common denominator - according to the result of the two Future Workshops - which marks a new quality. The challenge for future systems technologies lies in the mastering of complexity. Rigid and inflexible systems, built under a strict top-down regime, have reached the limits of manageable complexity, as has become obvious from the recent failure of several large-scale projects. Nature is the most complex system we know, and she has solved the problem somehow. We just haven't understood exactly how nature does it. But it is clear that systems designed by nature, like an anthill or a beehive or a swarm of birds or a city, are different from today's technical systems that have been designed by engineers and computer scientists.
"Rechenleistung an den Arbeitsplatz!" und "Jedem Entwickler seinen eigenen Rechner!" Die Realisierung dieser Forderungen moderner Datenverarbeitung be- stimmt wesentlich das weltweite Marktvolumen fur den Rechnereinsatz. Der Wunsch nach immer hoherer Rechenleistung, das Angebot hoher Verarbeitungs- gesch windigkei t, die ErschlieBung neuer und komplexerer Anwendungsfelder, die Verbesserung der Technologien und die Erweiterungen der Systemfunktionen sind Kriterien, die durch die Bereitstellung zentraler Rechenkapazitaten der Mainframes nicht mehr allein befriedigt werden konnen. Dezentralisierung von Rechenleistung und kooperative Strategien der Zusammenarbeit im Verbund sind Motivation und Motor fur die Entwicklung heutiger professioneller Work- stations. Faktoren fUr den Erfolg der Workstations, der sich mittelfristig in Wachstumsra- ten von 30% pro Jahr ausdruckt, sind: Technologie, die hochste Rechenleistung aufkleinstem Raum ermoglicht, Architektur, die diese Moglichkeiten in Verarbei- tungsleistung umsetzt, Graphik, die die notwendigen leistungsfahigen Benutzer- oberflachen schafft, Vernetzung, die sich bei Aufgabenverteilung und Kooperation auf Standards der Kommunikation stiitzt, und nicht zuletzt ein Standard- Betriebssystem als Basis fur die breite Verfugbarkeit von Anwender-Software. Das Aufkommen von RISC-Konzepten (Reduced Instruction Set Computer) sorgte fur einen Leistungssprung bei Prozessoren und Rechensystemen und erOffnet den Workstations neue Leistungsbereiche und Anwendungen. Daneben ist der Erfolg der Workstation in den letzten 10 J ahren untrennbar verknupft mit der Entwick- lung des allgemein verfugbaren und weitgehend standardisierten Betriebs- systems UNIX.