This volume brings together recent theoretical work in Learning Classifier Systems (LCS), a Machine Learning technique combining Genetic Algorithms and Reinforcement Learning. It includes self-contained background chapters on related fields (reinforcement learning and evolutionary computation), tailored for a classifier systems audience and written by acknowledged authorities in their areas, as well as a historically significant original work by John Holland.
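To make the combination concrete, here is a minimal illustrative sketch (not taken from the book) of the ternary-alphabet rule matching that classifier systems commonly use; the rule strings, state, and action names are invented for the example.

```python
# Minimal illustration of rule matching in a Michigan-style classifier system:
# conditions are strings over {'0', '1', '#'}, where '#' matches either bit.

def matches(condition: str, state: str) -> bool:
    """Return True if every non-'#' position of the condition equals the state bit."""
    return all(c == '#' or c == s for c, s in zip(condition, state))

# The match set is the subset of the rule population whose conditions fit the state.
population = [("1#0#", "action_A"), ("##11", "action_B"), ("10##", "action_C")]
state = "1011"
match_set = [(cond, act) for cond, act in population if matches(cond, state)]
print(match_set)  # [('##11', 'action_B'), ('10##', 'action_C')]
```

In a full LCS, the genetic algorithm evolves such condition-action rules while reinforcement learning updates their predicted payoffs.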
This book describes a cross-domain architecture and design tools for networked complex systems in which application subsystems of different criticality coexist and interact on networked multi-core chips. The architecture leverages multi-core platforms for a hierarchical system perspective of mixed-criticality applications. This system perspective is realized through virtualization to establish security, safety and real-time performance. The impact further includes a reduction in time-to-market, decreased development, deployment and maintenance costs, and the exploitation of economies of scale through cross-domain components and tools. Describes an end-to-end architecture at the hypervisor, chip, and cluster levels. Offers a solution for different types of resources, including processors, on-chip communication, off-chip communication, and I/O. Provides a cross-domain approach with examples for wind power, health care, and avionics. Introduces hierarchical adaptation strategies for mixed-criticality systems. Provides modular verification and certification methods for the seamless integration of mixed-criticality systems. Covers platform technologies, along with a methodology for the development process. Presents an experimental evaluation of technological results in cooperation with industrial partners. The information in this book will be extremely useful to industry leaders who design and manufacture products with distributed embedded systems in mixed-criticality use cases. It will also benefit suppliers of embedded components or development tools used in this area. As an educational tool, this material can be used to teach students and working professionals in areas including embedded systems, computer networks, system architecture, dependability, real-time systems, and avionics, wind-power and health-care systems.
Mathematical Visualization is a young discipline. It offers efficient visualization tools to the classical subjects of mathematics, and applies mathematical techniques to problems in computer graphics and scientific visualization. It originated in the interdisciplinary area of differential geometry, numerical mathematics, and computer graphics. In recent years, the methods developed have found important applications.
This book provides a framework for the design of competent optimization techniques by combining advanced evolutionary algorithms with state-of-the-art machine learning techniques. The primary focus of the book is on two algorithms that replace the traditional variation operators of evolutionary algorithms by learning and sampling Bayesian networks: the Bayesian optimization algorithm (BOA) and the hierarchical BOA (hBOA). They provide a scalable solution to a broad class of problems. The book provides an overview of evolutionary algorithms that use probabilistic models to guide their search, motivates and describes BOA and hBOA in a way accessible to a wide audience, and presents numerous results confirming that they are revolutionary approaches to black-box optimization.
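As a rough illustration of the model-build-and-sample loop that such algorithms follow, here is a sketch of a much simpler univariate estimation-of-distribution algorithm (UMDA) on the OneMax problem; BOA and hBOA differ in that they learn full Bayesian networks rather than independent per-bit frequencies, and all parameters below are arbitrary choices for the example.

```python
import random

def umda_onemax(n_bits=20, pop_size=60, n_select=30, generations=40, seed=0):
    """Univariate marginal distribution algorithm on OneMax: a simple relative
    of BOA that replaces crossover/mutation with model building (per-bit
    marginal frequencies) and sampling."""
    rng = random.Random(seed)
    p = [0.5] * n_bits                      # start from uniform marginals
    best = None
    for _ in range(generations):
        # Sample a population from the current probabilistic model.
        pop = [[1 if rng.random() < p[i] else 0 for i in range(n_bits)]
               for _ in range(pop_size)]
        pop.sort(key=sum, reverse=True)     # OneMax fitness = number of ones
        selected = pop[:n_select]
        if best is None or sum(pop[0]) > sum(best):
            best = pop[0]
        # Rebuild the model from the selected individuals (univariate marginals).
        p = [sum(ind[i] for ind in selected) / n_select for i in range(n_bits)]
    return best

print(sum(umda_onemax()))  # typically converges to the all-ones string (fitness 20)
```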
This research handbook provides a comprehensive, integrative, and authoritative resource on the main strategic management issues for companies within the e-business context. It covers an extensive set of topics, dealing with the major issues which articulate the e-business framework from a business perspective. The handbook is divided into the following e-business related parts: background; evolved strategic framework for the management of companies; key business processes, areas and activities; and, finally, emerging issues, trends and opportunities, with special attention to diverse Social Web-related implications. The articles are varied, timely and present high-quality research; many of these unique contributions will be especially valued and influential for business scholars and professionals interested in e-business. Many of the contributors are outstanding business scholars who are or have been editors-in-chief of top-ranked management and business journals or have made significant contributions to the development of their respective fields.
Knowledge Discovery is today a significant area of study and research. In finding answers to many research questions in this area, the ultimate hope is that knowledge can be extracted from the various forms of data around us. This book covers recent advances in unsupervised and supervised data analysis methods in Computational Intelligence for knowledge discovery. In its first part the book provides a collection of recent research on distributed clustering, self-organizing maps and their recent extensions. If labeled data or data with known associations are available, we may be able to use supervised data analysis methods, such as classifying neural networks, fuzzy rule-based classifiers, and decision trees. The book therefore also presents a collection of important methods of supervised data analysis. "Classification and Clustering for Knowledge Discovery" also includes a variety of applications of knowledge discovery in health, safety, commerce, mechatronics, sensor networks, and telecommunications.
Protein informatics is a newer name for an already existing discipline. It encompasses the techniques used in bioinformatics and molecular modeling that are related to proteins. While bioinformatics is mainly concerned with the collection, organization, and analysis of biological data, molecular modeling is devoted to the representation and manipulation of the structure of proteins. Protein informatics requires substantial prerequisites in computer science, mathematics, and molecular biology. The approach chosen here allows a direct and rapid grasp of the subject, starting from basic knowledge of algorithm design, calculus, linear algebra, and probability theory. An Introduction to Protein Informatics, a professional monograph, provides the reader with a comprehensive introduction to the field of protein informatics. The text emphasizes mathematical and computational methods to tackle the central problems of alignment, phylogenetic reconstruction, and prediction and sampling of protein structure. An Introduction to Protein Informatics is designed for a professional audience, composed of researchers and practitioners within bioinformatics, molecular modeling, algorithm design, optimization, and pattern recognition. This book is also suitable as a graduate-level text for students in computer science, mathematics, and biomedicine.
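Alignment, one of the central problems named above, is classically attacked with dynamic programming; the sketch below shows a bare-bones global alignment score in the style of Needleman-Wunsch, with made-up scoring parameters and toy sequences rather than anything taken from the book.

```python
def global_alignment_score(a: str, b: str, match=1, mismatch=-1, gap=-2):
    """Needleman-Wunsch dynamic programming: score of the best global alignment."""
    rows, cols = len(a) + 1, len(b) + 1
    dp = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):            # aligning a prefix of `a` against gaps
        dp[i][0] = i * gap
    for j in range(1, cols):            # aligning a prefix of `b` against gaps
        dp[0][j] = j * gap
    for i in range(1, rows):
        for j in range(1, cols):
            diag = dp[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            dp[i][j] = max(diag, dp[i - 1][j] + gap, dp[i][j - 1] + gap)
    return dp[-1][-1]

print(global_alignment_score("GATTACA", "GCATGCU"))
```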
Designing Secure IoT Devices with the Arm Platform Security Architecture and Cortex-M33 explains how to design and deploy secure IoT devices based on the Cortex-M23/M33 processors. The book is split into three parts. First, it introduces the Cortex-M33, its architectural design and major processor peripherals. Second, it shows how to design secure software and secure communications to minimize the threat of both hardware and software hacking. And finally, it examines common IoT cloud systems and how to design and deploy a fleet of IoT devices. Example projects are provided for the Keil MDK-ARM and NXP LPCXpresso tool chains. Since their inception, microcontrollers have been designed as functional devices with a CPU, memory and peripherals that can be programmed to accomplish a huge range of tasks. With the growth of internet-connected devices and the Internet of Things (IoT), "plain old microcontrollers" are no longer suitable as they lack the features necessary to create both a secure and functional device. Arm's recent development of the Cortex-M23 and Cortex-M33 processors is intended for today's IoT world.
This book presents an up-to-date account of research in important topics of fuzzy group theory. It concentrates on the theoretical aspects of fuzzy subgroups of a group. It includes applications to abstract recognition problems and to coding theory. The book begins with basic properties of fuzzy subgroups. Fuzzy subgroups of Hamiltonian, solvable, P-Hall, and nilpotent groups are discussed. Construction of free fuzzy subgroups is determined. Numerical invariants of fuzzy subgroups of Abelian groups are developed. The problem in group theory of obtaining conditions under which a group can be expressed as a direct product of its normal subgroups is considered. Methods for deriving fuzzy theorems from crisp ones are presented and the embedding of lattices of fuzzy subgroups into lattices of crisp groups is discussed as well as deriving membership functions from similarity relations. The material presented makes this book a good reference for graduate students and researchers working in fuzzy group theory.
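For orientation, the standard notion underlying this material (Rosenfeld's classical definition, not a formulation specific to this book) is that a fuzzy subset \mu : G \to [0,1] of a group G is a fuzzy subgroup when

\mu(xy) \ge \min\{\mu(x),\, \mu(y)\} \quad \text{and} \quad \mu(x^{-1}) \ge \mu(x) \qquad \text{for all } x, y \in G,

so that every non-empty level set \{x \in G : \mu(x) \ge t\} is an ordinary (crisp) subgroup of G.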
This book is written from an engineer's perspective of the mind. "Artificial Mind System" exposes the reader to a broad spectrum of interesting areas in general brain science and mind-oriented studies. In this research monograph a picture of the holistic model of an artificial mind system and its behaviour is drawn, as concretely as possible, within a unified context, which could eventually lead to practical realisation in terms of hardware or software. With a view that "the mind is a system always evolving," ideas inspired by many branches of study related to brain science are integrated within the text, namely artificial intelligence, cognitive science/psychology, connectionism, consciousness studies, general neuroscience, linguistics, pattern recognition/data clustering, robotics, and signal processing.
"Wireless is coming" was the message received by VLSI designers in the early 1990's. They believed it. But they never imagined that the wireless wave would be coming with such intensity and speed. Today one of the most challenging areas for VLSI designers is VLSI circuit and system design for wireless applications. New generation of wireless systems, which includes multimedia, put severe constraints on performance, cost, size, power and energy. The challenge is immense and the need for new generation of VLSI designers, who are fluent in wireless communication and are masters of mixed signal design, is great. No single text or reference book contains the necessary material to educate such needed new generation of VLSIdesigners. There are gaps. Excellent books exist on communication theory and systems, including wireless applications and others treat well basic digital, analog and mixed signal VLSI design. We feel that this book is the first of its kind to fill that gap. In the first half of this book we offer the reader (the VLSI designer) enough material to understand wireless communication systems. We start with a historical account. And then we present an overview of wireless communication systems. This is followed by detailed treatment of related topics; the mobile radio, digital modulation and schemes, spread spectrum and receiver architectures. The second half of the book deals with VLSI design issues related to mixed-signal design. These include analog-to-digital conversion, transceiver design, digital low-power techniques, amplifier design, phase locked loops and frequency synthesizers.
Rem tene, verba sequentur - "grasp the subject, the words will follow" (Gaius J. Victor, Rome, VI century b.c.). The ultimate goal of this book is to bring the fundamental issues of information granularity, inference tools and problem-solving procedures into a coherent, unified, and fully operational framework. The objective is to offer the reader a comprehensive, self-contained, and uniform exposure to the subject. The strategy is to isolate some fundamental bricks of Computational Intelligence in terms of key problems and methods, and to discuss their implementation and underlying rationale within a well-structured and rigorous conceptual framework, carefully related to various application facets. The main assumption is that a deep understanding of the key problems will allow the reader to compose into a meaningful mosaic the puzzle pieces represented by the immense variety of approaches present in the literature and in computational practice. All in all, the main approach advocated in the monograph consists of a sequence of steps offering solid conceptual fundamentals, presenting a carefully selected collection of design methodologies, discussing a wealth of development guidelines, and exemplifying them with pertinent, accurately selected illustrative material.
The evolution of modern computers began more than 50 years ago and has been driven to a large extent by rapid advances in electronic technology during that period. The first computers ran one application (user) at a time. Without the benefit of operating systems or compilers, the application programmers were responsible for managing all aspects of the hardware. The introduction of compilers allowed programmers to express algorithms in abstract terms without being concerned with the bit-level details of their implementation. Time-sharing operating systems took computing systems one step further and allowed several users and/or applications to time-share the computing services of computers. With the advances of networks and software tools, users and applications were able to time-share the logical and physical services that are geographically dispersed across one or more networks. The Virtual Computing (VC) concept aims at providing ubiquitous open computing services in a way analogous to the services offered by telephone and electrical (utility) companies. The VC environment should be dynamically set up to meet the requirements of a single user and/or application. The design and development of dynamically programmable virtual computing environments is a challenging research problem. However, the recent advances in processing and network technology and software tools have successfully solved many of the obstacles facing the wide deployment of virtual computing environments, as will be outlined next.
Organizational Semiotics occupies an important niche in the research community of human communication and information systems. It opens up new ways of understanding the functioning of information and information resources in organised behaviour. In recent years, a number of workshops and conferences have provided researchers and practitioners opportunities to discuss their theories, methods and practices and to assess the benefits and potential of this approach. Literature in this field is much in demand but still difficult to find, so we are pleased to offer a third volume in the mini-series of Studies in Organizational Semiotics. This book is based on the papers and discussions of the fifth workshop on Organizational Semiotics held in Delft, June 13-15, 2002, hosted by Groningen University and Delft Technical University in the Netherlands. The topic of this workshop was dynamics and change in organizations. The chapters in this book reflect recent developments in theory and applications and demonstrate the significance of Organizational Semiotics to information systems, human communication and coordination, organizational analysis and modelling. In particular, it provides a framework that accommodates both the technical and social aspects of information systems. The mini-series presents the frontier of the research in this area and shows how the theory and techniques enhance the quality of work on information systems.
The application of Computational Intelligence in emerging research areas such as Granular Computing, Mechatronics, and Bioinformatics shows its usefulness, as has often been emphasized by Prof. Lotfi Zadeh, the inventor of fuzzy logic, and many others. This book contains recent advances in Computational Intelligence methods for modeling, optimization and prediction and covers a large number of applications. The book presents new Computational Intelligence theory and methods for modeling and prediction. The range of the various applications is captured with 5 chapters on image processing, 2 chapters on audio processing, 3 chapters on commerce and finance, 2 chapters on communication networks and 6 chapters containing other applications.
Games and simulations are not only a rapidly growing source of entertainment in today's world; they are also quite beneficial. They enable players to develop quick-reaction and motor skills, engage cognitive processes, and interact with peers around the globe, thereby enhancing social skills. However, as a result of the rise of games and simulations, educators are struggling to engage their students through more traditional ways of learning. Educational Gameplay and Simulation Environments: Case Studies and Lessons Learned presents a remarkable collection of cases demonstrating how to conceptualize, design, and implement games and simulations effectively for learning. This paramount publication will aid educators, researchers, and game developers in broadening their work to effectively create and implement engaging learning environments for present and future students.
Governments across the world have recognised the potential of new information and communication technologies (ICTs) to bring about fundamental renewal in not only government and public sector processes, but also their relationship with civil societal groups, the private sector, citizens, and various other actors. ICT provides enormous opportunities to increase efficiency and effectiveness in all kinds of policy sectors, and promises a real dialogue between policy makers and the public. This second edition of the prescient and influential work first published in 2001 includes updated texts of several chapters from the earlier edition as well as various new chapters, among them a number of country reports written for the e-government session of the 17th World Congress of Comparative Law. In addition to visions of the concept of electronic government, it provides examples of already active electronic governance by including various chapters on developments in the United States (both federal and state), the United Kingdom, Canada, Germany, Italy, Denmark, and the Netherlands. It draws valuable lessons (cross-national, between policy sectors and across administrations) from the design of electronic government and from evaluations of electronic government in practice. Aspects of e-government covered in the second edition include the following: government initiatives such as e-publication and online filing (including e-procurement and courts e-filing); 'e-democracy' features such as e-voting, e-participation, e-consultation and e-petitioning; benefits of government use of such expanding technologies as global positioning systems, smartcards, and biometrics; benefits to citizens of services such as social security and services in the health care sector; applications to the judicial system and law enforcement; differences between developments and policy initiatives in various countries; and obstacles and dilemmas touching upon security, surveillance, identity fraud, liability, intellectual property, free access, national security, equality, and privacy. Especially in its close attention to the interaction between legal, practical, public administration and ethical obstacles and dilemmas, "Designing E-Government, Second Edition" is of enormous value to practitioners, officials, and policymakers concerned with the legal implications related to the design and implementation of e-government, and with the present and future challenges of this endeavour.
Bioinformatics involves specialized application of computer technology to investigative and conceptual problems in biology and medicine; neuroinformatics (NI) is the practice of bioinformatics in the neurosciences. Over the past two decades the biomedical sciences have been revolutionized by databases, data mining and data modeling techniques. The Human Genome Project, which depended on informatics methods, has been the most widely recognized bioinformatics undertaking. Bioinformatics has since been applied all across biology and medicine, and has also transformed almost every avenue of neuroscience. Yet in neuropsychology, NI perspectives remain largely unrealized. Ironically, NI offers enormous potential to the essential praxis of neuropsychology - assessing cognitive behavior and relating cognition to neural systems. Neuroinformatics can be applied to neuropsychology as richly as it has been applied across the neurosciences. Neuroinformatics for Neuropsychology is the first book to explain the relevance and value of NI to neuropsychology. It systematically describes NI tools, applications and models that can enhance the efforts of neuropsychologists. It also describes the implications of NI for neuropsychology in the 21st century: fundamental shifts away from the conventional modes of research, practice and communication that have thus far characterized the field.
Written by one of the foremost experts on the subject, Neuroinformatics for Neuropsychology is a vital introduction to a profound technological practice and important reading for clinical neuropsychologists, cognitive neuroscientists, behavioral neurologists, and speech-language pathologists. Researchers, clinicians, and graduate students interested in informatics for the brain-behavioral sciences will especially welcome this unique volume.
This book addresses two main themes. The first is the discipline of informatics itself. Two major questions will be discussed: how can we obtain and keep track of a systematic and objective overview of the vast landscape of higher informatics education, both nationally and internationally? And would it be useful to rationalize and redesign the informatics curricula, leading to less fragmentation and more commonality? The second theme is the relation between informatics and other disciplines, with the following main questions: what informatics do we need to offer a coherent curriculum which suits the needs of today's information society with respect to specific disciplines? What is relevant in informatics and CIT to provide to others? And what informatics concepts, methods and techniques form the hard core needed in every other discipline?
In March 2002, the Naval Research Laboratory brought together leading researchers and government sponsors for a three-day workshop in Washington, D.C. on Multi-Robot Systems. The workshop began with presentations by various government program managers describing application areas and programs with an interest in multi-robot systems. Government representatives were on hand from the Office of Naval Research, the Air Force, the Army Research Lab, the National Aeronautics and Space Administration, and the Defense Advanced Research Projects Agency. Top researchers then presented their current activities in the areas of multi-robot systems and human-robot interaction. The first two days of the workshop concentrated on multi-robot control issues, including the topics of localization, mapping, and navigation; distributed surveillance; manipulation; coordination and formations; and sensors and hardware. The third day was focused on human interactions with multi-robot teams. All presentations were given in a single-track workshop format. This proceedings volume documents the work presented by these researchers at the workshop. The invited presentations were followed by panel discussions, in which all participants interacted to highlight the challenges of this field and to develop possible solutions. In addition to the invited research talks, students were given an opportunity to present their work at poster sessions.
This book covers the recent applications of computational intelligence techniques in reliability engineering. This volume contains a survey of the contributions made to the optimal reliability design literature in recent years. It also contains chapters devoted to different applications of a genetic algorithm in reliability engineering and to combinations of this algorithm with other computational intelligence techniques.
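As a hedged illustration of the kind of optimization the blurb refers to (not an example taken from the book), the sketch below applies a plain genetic algorithm to a toy redundancy-allocation problem; the component reliabilities, costs, budget, and GA parameters are all invented for the example.

```python
import random

# Toy redundancy allocation: choose how many parallel copies of each component
# to use, maximizing series-system reliability under a cost budget.
RELIABILITY = [0.80, 0.90, 0.70]   # made-up component reliabilities
COST = [3.0, 4.0, 2.0]             # made-up per-copy costs
BUDGET = 30.0
MAX_COPIES = 5

def system_reliability(copies):
    """Series system of parallel blocks: R = prod_i (1 - (1 - r_i)^n_i)."""
    r = 1.0
    for ri, ni in zip(RELIABILITY, copies):
        r *= 1.0 - (1.0 - ri) ** ni
    return r

def fitness(copies):
    cost = sum(c * n for c, n in zip(COST, copies))
    return system_reliability(copies) if cost <= BUDGET else 0.0  # penalize infeasible

def tournament(pop, rng, k=3):
    """Pick the fittest of k randomly sampled individuals."""
    return max(rng.sample(pop, k), key=fitness)

def genetic_algorithm(pop_size=30, generations=60, mutation_rate=0.2, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(1, MAX_COPIES) for _ in RELIABILITY] for _ in range(pop_size)]
    for _ in range(generations):
        children = []
        while len(children) < pop_size:
            a, b = tournament(pop, rng), tournament(pop, rng)
            cut = rng.randrange(1, len(RELIABILITY))     # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < mutation_rate:             # per-child mutation
                child[rng.randrange(len(child))] = rng.randint(1, MAX_COPIES)
            children.append(child)
        pop = children
    return max(pop, key=fitness)

best = genetic_algorithm()
print(best, round(fitness(best), 4))
```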
This textbook presents mathematical models in bioinformatics and describes biological problems that inspire the computer science tools used to manage the enormous data sets involved. The first part of the book covers mathematical and computational methods, with practical applications presented in the second part. The mathematical presentation avoids unnecessary formalism, while remaining clear and precise. The book closes with a thorough bibliography, reaching from classic research results to very recent findings. This volume is suited for a senior undergraduate or graduate course on bioinformatics, with a strong focus on mathematical and computer science background.
This book covers performance analysis of computer networks, and begins by providing the necessary background in probability theory, random variables, and stochastic processes. Queuing theory and simulation are introduced as the major tools analysts have access to. It presents performance analysis on local, metropolitan, and wide area networks, as well as on wireless networks. It concludes with a brief introduction to self-similarity. Designed for a one-semester course for senior-year undergraduates and graduate engineering students, it may also serve as a fingertip reference for engineers developing communication networks, managers involved in systems planning, and researchers and instructors of computer communication networks.
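As a small taste of the queuing-theory toolkit the book introduces (a textbook-standard M/M/1 example, not code from the book), the snippet below computes steady-state utilization, mean occupancy, and delays from arrival and service rates; the numeric rates in the usage line are arbitrary.

```python
def mm1_metrics(arrival_rate: float, service_rate: float):
    """Standard M/M/1 steady-state results from queuing theory.
    arrival_rate (lambda) and service_rate (mu) are in customers per unit time."""
    if arrival_rate >= service_rate:
        raise ValueError("Queue is unstable unless lambda < mu.")
    rho = arrival_rate / service_rate          # server utilization
    l_system = rho / (1.0 - rho)               # mean number in system
    w_system = l_system / arrival_rate         # mean time in system (Little's law)
    w_queue = w_system - 1.0 / service_rate    # mean waiting time in queue
    return {"utilization": rho, "L": l_system, "W": w_system, "Wq": w_queue}

# Example: packets arrive at 80/s to a link that serves 100/s on average.
print(mm1_metrics(80.0, 100.0))  # utilization 0.8, L = 4, W = 0.05 s, Wq = 0.04 s
```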