This textbook serves as an introduction to fault tolerance, intended for upper-division undergraduate students, graduate-level students and practicing engineers in need of an overview of the field. Readers will develop skills in modeling and evaluating fault-tolerant architectures in terms of reliability, availability and safety. They will gain a thorough understanding of fault-tolerant computers, including both the theory of how to design and evaluate them and the practical knowledge of achieving fault tolerance in electronic, communication and software systems. Coverage includes fault-tolerance techniques through hardware, software, information and time redundancy. The content is designed to be highly accessible, including numerous examples and exercises. Solutions and PowerPoint slides are available for instructors.
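To make the flavour of such reliability evaluation concrete, here is a minimal sketch (an illustration, not material from the textbook) comparing a single module with a triple modular redundancy (TMR) arrangement, assuming independent module failures, a perfect voter, and an exponential failure law; the failure rate and mission time are arbitrary illustrative values.

```python
import math

def module_reliability(failure_rate: float, t: float) -> float:
    """Reliability of one module under a constant failure rate (exponential law)."""
    return math.exp(-failure_rate * t)

def tmr_reliability(failure_rate: float, t: float) -> float:
    """Reliability of a 2-out-of-3 (TMR) system with a perfect voter:
    the system survives as long as at least two of the three independent modules do."""
    r = module_reliability(failure_rate, t)
    return 3 * r**2 - 2 * r**3

if __name__ == "__main__":
    lam, t = 1e-4, 1000.0   # illustrative values: 1e-4 failures/hour, 1000-hour mission
    print(f"simplex module: {module_reliability(lam, t):.4f}")  # ~0.9048
    print(f"TMR system:     {tmr_reliability(lam, t):.4f}")     # ~0.9746
```

The same comparison also exposes the trade-off such evaluations weigh: TMR improves mission reliability at the cost of triple the hardware, and its advantage disappears once single-module reliability falls below 0.5.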
Systems Engineering Guidebook: A Process for Developing Systems and Products is intended to provide readers with a guide to understanding and becoming familiar with the systems engineering process, its application, and its value to the successful implementation of systems development projects. The book describes the systems engineering process as a multidisciplinary effort. The process is defined in terms of specific tasks to be accomplished, with great emphasis placed on defining the problem that is being addressed prior to designing the solution.
This innovative book recognizes the need within the object-oriented community for a book that goes beyond the tools and techniques of the typical methodology book. In Analysis Patterns: Reusable Object Models, Martin Fowler focuses on the end result of object-oriented analysis and design - the models themselves. He shares with you his wealth of object modeling experience and his keen eye for identifying repeating problems and transforming them into reusable models. Analysis Patterns provides a catalogue of patterns that have emerged in a wide range of domains, including trading, measurement, accounting and organizational relationships. Recognizing that conceptual patterns cannot exist in isolation, the author also presents a series of "support patterns" that discuss how to turn conceptual models into software that in turn fits into an architecture for a large information system. Each pattern includes the reasoning behind its design, rules for when it should and should not be used, and tips for implementation. The examples presented in this book comprise a cookbook of useful models and insight into the skill of reuse that will improve analysis, modeling and implementation.
Under the vast umbrella of Plant Sciences resides a plethora of highly specialized fields. Botanists, agronomists, horticulturists, geneticists, and physiologists each employ a different approach to the study of plants, and each for a different end goal. Yet all will find themselves in the laboratory engaging in what can broadly be termed biotechnology. Addressing a wide variety of related topics, Plant Tissue Culture, Development, and Biotechnology gives the practical and technical knowledge needed to train the next generation of plant scientists regardless of their ultimate specialization. With the detailed perspectives and hands-on training that are the signature of the authors' previous bestselling books, Plant Development and Biotechnology and Plant Tissue Culture Concepts and Laboratory Exercises, this book discusses relevant concepts supported by demonstrative laboratory experiments. It provides critical thinking questions, concept boxes highlighting important ideas, and procedure boxes giving precise instructions for experiments, including step-by-step procedures (such as proper microscope use with digital photography), anticipated results, and a list of materials needed to perform them. Integrating traditional plant sciences with recent advances in plant tissue culture, development, and biotechnology, chapters address germplasm preservation, plant growth regulators, embryo rescue, micropropagation of roses, haploid cultures, and transformation of meristems. Going beyond the scope of a simple laboratory manual, this book also considers special topics such as copyrights, patents, legalities, trade secrets, and the business of biotechnology. Focusing on plant culture development and its applications in biotechnology across a myriad of plant science specialties, this text uses a broad range of species and practical laboratory exercises to make it useful for readers across these specialties.
In recent years, the use of technology for the purposes of improving and enriching traditional instructional practices has received a great deal of attention. However, few works have explicitly examined the cognitive, psychological, and educational principles on which technology-supported learning environments are based. This volume addresses the need for a thorough theoretical analysis and discussion of the principles of system design that underlie the construction of technology-enhanced learning environments. It presents examples of technology-supported learning environments that cover a broad range of content domains, from the physical sciences and mathematics to the teaching of language and literacy.
Hybrid Intelligent Techniques for Pattern Analysis and Understanding outlines the latest research on the development and application of synergistic approaches to pattern analysis in real-world scenarios. An invaluable resource for lecturers, researchers, and graduate students in computer science and engineering, this book covers a diverse range of hybrid intelligent techniques, including image segmentation, character recognition, human behavioral analysis, hyperspectral data processing, and medical image analysis.
This book gives an in-depth introduction to the areas of modeling, identification, simulation, and optimization. These scientific topics play an increasingly dominant part in many engineering areas such as electrotechnology, mechanical engineering, aerospace, and physics. This book represents a unique and concise treatment of the mutual interactions among these topics.
This volume's goal is to begin to document the dialogue processes in naturally-occurring human tutoring, in the context of informing the design of intelligent tutoring systems, and of interactive systems in general. This project represents the first empirical study of human tutorial dialogue from a conversation analytic perspective -- the conversational interaction is the focus of analysis rather than larger-scale techniques for teaching. It is also the first study of tutoring to make use of large quantities of carefully transcribed tutoring conversations/dialogues.
The TransNav 2011 Symposium, held at the Gdynia Maritime University, Poland, in June 2011, brought together a wide range of participants from all over the world. The program offered a variety of contributions, allowing many aspects of navigational safety to be examined from different points of view. Topics presented and discussed at the Symposium were: navigation, safety at sea, sea transportation, education of navigators and simulator-based training, sea traffic engineering, ship's manoeuvrability, integrated systems, electronic chart systems, satellite, radio-navigation and anti-collision systems, and many others. This book is part of a series of six volumes; it provides an overview of Navigational Systems and Simulators and is addressed to scientists and professionals involved in research and development of navigation, safety of navigation and sea transportation.
The voices in this collection are primarily those of researchers and developers concerned with bringing knowledge of technological possibilities to bear on informed and effective system design. Their efforts are distinguished from many previous writings on system development by their central and abiding reliance on direct and continuous interaction with those who are the ultimate arbiters of system adequacy; namely, those who will use the technology in their everyday lives and work. A key issue throughout is the question of who does what to whom: whose interests are at stake, who initiates action and for what reason, who defines the problem and who decides that there is one.
This book seeks to establish an interdisciplinary, applied social scientific model for researchers and students that advocates a cooperative effort between machines and people. After showing that basic research on social processes offers much needed guidance for those creating technology and designing tools for group work, its papers demonstrate the mutual relevance of social science and information system design, and encourage better integration of these disciplines.
The overwhelming majority of a software system's lifespan is spent in use, not in design or implementation. So, why does conventional wisdom insist that software engineers focus primarily on the design and development of large-scale computing systems? In this collection of essays and articles, key members of Google's Site Reliability Team explain how and why their commitment to the entire lifecycle has enabled the company to successfully build, deploy, monitor, and maintain some of the largest software systems in the world. You'll learn the principles and practices that enable Google engineers to make systems more scalable, reliable, and efficient - lessons directly applicable to your organization. This book is divided into four sections: Introduction - learn what site reliability engineering is and why it differs from conventional IT industry practices; Principles - examine the patterns, behaviors, and areas of concern that influence the work of a site reliability engineer (SRE); Practices - understand the theory and practice of an SRE's day-to-day work: building and operating large distributed computing systems; and Management - explore Google's best practices for training, communication, and meetings that your organization can use.
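One calculation often associated with the SRE practices this book describes is the error budget, which turns an availability target into the amount of unavailability a service may "spend" over a period. The sketch below is only an illustration of that idea; the function name and parameters are assumptions for this example, not code from the book.

```python
def error_budget_minutes(slo: float, period_days: int = 30) -> float:
    """Minutes of allowed unavailability in the period for a given availability SLO,
    e.g. slo=0.999 for a 'three nines' target."""
    total_minutes = period_days * 24 * 60
    return (1.0 - slo) * total_minutes

if __name__ == "__main__":
    for slo in (0.99, 0.999, 0.9999):
        print(f"SLO {slo:.2%}: {error_budget_minutes(slo):.1f} minutes per 30 days")
```

A 99.9% target over 30 days, for example, leaves roughly 43 minutes of budget, the kind of figure teams use when weighing risky releases against stability work.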
Reconfigurable computing techniques and adaptive systems are some of the most promising architectures for microprocessors. Reconfigurable and Adaptive Computing: Theory and Applications explores the latest research activities on hardware architecture for reconfigurable and adaptive computing systems. The first section of the book covers reconfigurable systems. The book presents a software and hardware codesign flow for coarse-grained systems-on-chip, a video watermarking algorithm for the H.264 standard, a solution for regular expressions matching systems, and a novel field programmable gate array (FPGA)-based acceleration solution with MapReduce framework on multiple hardware accelerators. The second section discusses network-on-chip, including an implementation of a multiprocessor system-on-chip platform with shared memory access, end-to-end quality-of-service metrics modeling based on a multi-application environment in network-on-chip, and a 3D ant colony routing (3D-ACR) for network-on-chip with three different 3D topologies. The final section addresses the methodology of system codesign. The book introduces a new software-hardware codesign flow for embedded systems that models both processors and intellectual property cores as services. It also proposes an efficient algorithm for dependent task software-hardware codesign with the greedy partitioning and insert scheduling method (GPISM) by task graph.
Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field.
Control Engineering and Information Systems contains the papers presented at the 2014 International Conference on Control Engineering and Information Systems (ICCEIS 2014, Yueyang, Hunan, China, 20-22 June 2014). All major aspects of the theory and applications of control engineering and information systems are addressed, including:
- Intelligent systems
- Teaching cases
- Pattern recognition
- Industry application
- Machine learning
- Systems science and systems engineering
- Data mining
- Optimization
- Business process management
- Evolution of public sector ICT
- IS economics
- IS security and privacy
- Personal data markets
- Wireless ad hoc and sensor networks
- Database and system security
- Application of spatial information system
- Other related areas
Control Engineering and Information Systems provides a valuable source of information for scholars, researchers and academics in control engineering and information systems.
Modern-day projects require software and systems engineers to work together in realizing architectures of large and complex software-intensive systems. To date, the two have used their own tools and methods to deal with similar issues when it comes to the requirements, design, testing, maintenance, and evolution of these architectures. Software and Systems Architecture in Action explores practices that can be helpful in the development of architectures of large-scale systems in which software is a major component. Examining the synergies that exist between the disciplines of software and systems engineering, it presents concepts, techniques, and methods for creating and documenting architectures. The book describes an approach to architecture design that is driven by systemic quality attributes determined from both the business and technical goals of the system, rather than just its functional requirements. This architecture-centric design approach utilizes analytically derived patterns and tactics for quality attributes that inform the architect's design choices and help shape the architecture of a given system. The book includes coverage of techniques used to assess the impact of architecture-centric design on the structural complexity of a system. After reading the book, you will understand how to create architectures of systems and assess their ability to meet the business goals of your organization. Ideal for anyone involved with large and complex software-intensive systems, the book details powerful methods for engaging the software and systems engineers on your team. The book is also suitable for use in undergraduate and graduate-level courses on software and systems architecture as it exposes students to the concepts and techniques used to create and manage architectures of software-intensive systems.
Shaped by quantum theory, technology, and the genomics revolution, the integration of photonics, electronics, biomaterials, and nanotechnology holds great promise for the future of medicine. This topic has recently experienced explosive growth due to the noninvasive or minimally invasive nature and the cost-effectiveness of photonic modalities in medical diagnostics and therapy. The second edition of the Biomedical Photonics Handbook presents recent fundamental developments as well as important applications of biomedical photonics of interest to scientists, engineers, manufacturers, teachers, students, and clinical providers. The third volume, Therapeutics and Advanced Biophotonics, focuses on therapeutic modalities, advanced biophotonic technologies, and future trends. Representing the collective work of over 150 scientists, engineers, and clinicians, this three-volume handbook is designed to present the most recent advances in instrumentation and methods, as well as clinical applications in important areas of biomedical photonics, and provides an inclusive forum that serves as an authoritative reference source for a broad audience involved in the research, teaching, learning, and practice of medical technologies. What's new in this edition: A wide variety of photonic biochemical sensing technologies has already been developed for clinical monitoring of early disease states and physiological parameters, such as blood pressure, blood chemistry, pH, temperature, and the presence of pathological organisms or biochemical species of clinical importance. Advanced photonic detection technologies integrating the latest knowledge of genomics, proteomics, and metabolomics allow sensing of early disease states, thus revolutionizing the medicine of the future. Nanobiotechnology has opened new possibilities for the detection of biomarkers of disease, imaging of single molecules, and in situ diagnostics at the single-cell level. In addition to these state-of-the-art advancements, the second edition contains new topics and chapters, including: Fiber Optic Probe Design; Laser and Optical Radiation Safety; Photothermal Detection; Multidimensional Fluorescence Imaging; Surface Plasmon Resonance Imaging; Molecular Contrast Optical Coherence Tomography; Multiscale Photoacoustics; Polarized Light for Medical Diagnostics; Quantitative Diffuse Reflectance Imaging; Interferometric Light Scattering; Nonlinear Interferometric Vibrational Imaging; Nanoscintillator-Based Therapy; SERS Molecular Sentinel Nanoprobes; and Plasmonic Coupling Interference Nanoprobes. Comprising three books - Volume I: Fundamentals, Devices, and Techniques; Volume II: Biomedical Diagnostics; and Volume III: Therapeutics and Advanced Biophotonics - this second edition contains eight sections, provides introductory material in each chapter, and includes an overview of the topic, an extensive collection of spectroscopic data, and a list of references for further reading.
As the complexity of today's networked computer systems grows, they become increasingly difficult to understand, predict, and control. Addressing these challenges requires new approaches to building these systems. Adaptive, Dynamic, and Resilient Systems supplies readers with various perspectives of the critical infrastructure that systems of networked computers rely on. It introduces the key issues, describes their interrelationships, and presents new research in support of these areas.
Service computing is a cutting-edge area, popular in both industry and academia. New challenges have arisen in developing service-oriented systems with high-assurance requirements. High Assurance Services Computing captures and makes accessible the most recent practical developments in service-oriented high-assurance systems. An edited volume contributed by well-established researchers in this field worldwide, this book reports the best current practices and emerging methods in the areas of service-oriented techniques for high assurance systems. Available results from industry and government, R&D laboratories and academia are included, along with unreported results from the hands-on experiences of software professionals in the respective domains. Designed for practitioners and researchers working for industrial organizations and government agencies, High Assurance Services Computing is also suitable for advanced-level students in computer science and engineering.
There are essentially two theories of solutions that can be considered exact: the McMillan-Mayer theory and Fluctuation Solution Theory (FST). The first is mostly limited to solutes at low concentrations, while FST has no such issue. It is an exact theory that can be applied to any stable solution regardless of the number of components and their concentrations, and the types of molecules and their sizes. Fluctuation Theory of Solutions: Applications in Chemistry, Chemical Engineering, and Biophysics outlines the general concepts and theoretical basis of FST and provides a range of applications described by experts in chemistry, chemical engineering, and biophysics. The book, which begins with a historical perspective and an introductory chapter, includes a basic derivation for more casual readers. It is then devoted to providing new and very recent applications of FST. The first application chapters focus on simple model, binary, and ternary systems, using FST to explain their thermodynamic properties and the concept of preferential solvation. Later chapters illustrate the use of FST to develop more accurate potential functions for simulation, describe new approaches to elucidate microheterogeneities in solutions, and present an overview of solvation in new and model systems, including those under critical conditions. Expert contributors also discuss the use of FST to model solute solubility in a variety of systems. The final chapters present a series of biological applications that illustrate the use of FST to study cosolvent effects on proteins and their implications for protein folding. With the application of FST to study biological systems now well established, and given the continuing developments in computer hardware and software increasing the range of potential applications, FST provides a rigorous and useful approach for understanding a wide array of solution properties. This book outlines those approaches, and their advantages, ac
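For orientation, the central quantity in FST is the Kirkwood-Buff integral, which connects the pair correlation function of an open system to particle-number fluctuations. A standard statement of it (given here for context, not quoted from this book) is:

```latex
G_{ij} \;=\; \int_{0}^{\infty} \bigl[ g_{ij}(r) - 1 \bigr]\, 4\pi r^{2}\, \mathrm{d}r
\;=\; V\,\frac{\langle N_i N_j \rangle - \langle N_i \rangle \langle N_j \rangle}{\langle N_i \rangle \langle N_j \rangle}
\;-\; \frac{\delta_{ij}\, V}{\langle N_j \rangle}
```

Here g_ij(r) is the pair correlation function between species i and j in the grand canonical ensemble, N_i is the number of molecules of species i in volume V, and delta_ij is the Kronecker delta. A positive G_ij signals an excess of j around i, a negative value a deficit, and it is these integrals that FST combines into thermodynamic quantities such as partial molar volumes, compressibilities and preferential solvation parameters.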
Informational Macrodynamics (IMD) presents a unified, information-systemic approach with a common information language for modeling, analysis and optimization of a variety of interactive processes, such as physical, biological, economic, social, and informational, including human activities. Compared with thermodynamics, which deals with the transformation of energy and represents a theoretical foundation of physical technology, IMD deals with the transformation of information, and can be considered a theoretical foundation of Information Computer Technology (ICT). ICT includes but is not limited to applied computer science, computer information systems, computer and data communications, software engineering, and artificial intelligence. In ICT, information flows from different data sources, and interacts to create new information products. The information flows may interact physically or via their virtual connections, initiating an information dynamic process that can be distributed in space. As in physics, a central problem is understanding the general regularities of information processes in terms of information laws, for the engineering and technological design, control, optimization, and development of computer technology, operations, manipulations, and management of real information objects. Information Systems Analysis and Modeling: An Informational Macrodynamics Approach belongs to an interdisciplinary science that represents a new theoretical and computer-based methodology for system informational description and improvement, including various activities in such interdisciplinary areas as thinking, intelligent processes, management, and other nonphysical subjects with their mutual interactions, informational superimpositions, and the information transferred between interactions. Information Systems Analysis and Modeling: An Informational Macrodynamics Approach can be used as a textbook or secondary text in courses on computer science, engineering, business, management, education, and psychology and as a reference for research and industry.
The long-standing cultural imperative of augmenting human intellect continues to move ever closer to its full manifestation, described by Marshall McLuhan as an extension of the human nervous system. The escalating blending of immersive technologies with advanced computation has created an emerging domain which increasingly allows socio-technical system makers to produce not only human-computer interactions, but advanced, multi-minded human+computer (H+C) systems. The critical shift toward user immersion within systems of digital information and simulation makes the scale of immersive media's potential impact on human life, culture and well-being unlike that of any previous medium. In Designing XR, Peter (Zak) Zakrzewski presents H+C immersion as a multi-dimensional design problem - a Research Through Design (RTD) zone which addresses the question: how can a transformative, design-thinking-based knowledge system complement the existing human-computer interaction (HCI) invention model to contribute to the creation of more participatory, socially viable, and humane immersive media environments? The book lays out a proposal for ushering in the creation of an ecologically sound augmented mind based on two essential tasks. The first involves a framework for the design, implementation, and iteration of purposeful, multi-minded, participatory immersive H+C systems. The second focuses on the extended reality experience (XRX) design practice that rhetorically invites users to actively engage with immersive systems while fully exercising their autonomy and agency based on informed choice.
This book, presented in three volumes, examines environmental disciplines in relation to major players in contemporary science: Big Data, artificial intelligence and cloud computing. Today, there is a real sense of urgency regarding the evolution of computer technology, the ever-increasing volume of data, threats to our climate and the sustainable development of our planet. As such, we need to reduce technology just as much as we need to bridge the global socio-economic gap between the North and South; between universal free access to data (open data) and free software (open source). In this book, we pay particular attention to certain environmental subjects, in order to enrich our understanding of cloud computing. These subjects are: erosion; urban air pollution and atmospheric pollution in Southeast Asia; melting permafrost (causing the accelerated release of soil organic carbon in the atmosphere); alert systems of environmental hazards (such as forest fires, prospective modeling of socio-spatial practices and land use); and web fountains of geographical data. Finally, this book asks the question: in order to find a pattern in the data, how do we move from a traditional computing model-based world to pure mathematical research? After thorough examination of this topic, we conclude that this goal is both transdisciplinary and achievable.
You may like...
Operational Research - IO2017, Valenca… by A. Ismael F. Vaz, Joao Paulo Almeida, … (Hardcover) - R2,948
Advances in Algorithms, Languages, and… by Dingzhu Du, Ker-I Ko (Hardcover) - R5,828
A Level Maths Essentials Core 2 for AQA… by Janet Crawshaw, Karim Hirani (CD-ROM) - R555
Visual Explorations in Finance - with… by Guido Deboeck, Teuvo Kohonen (Hardcover) - R1,570