
Logic Synthesis Using Synopsys (R) (Paperback, 2nd ed. 1997)
Pran Kurup, Taher Abbasi
R4,025 Discovery Miles 40 250 Ships in 18 - 22 working days

Logic Synthesis Using Synopsys (R), Second Edition is for anyone who hates reading manuals but would still like to learn logic synthesis as practised in the real world. Synopsys Design Compiler, the leading synthesis tool in the EDA marketplace, is the primary focus of the book. The contents of this book are specially organized to assist designers accustomed to schematic capture-based design to develop the expertise required to use the Synopsys Design Compiler effectively. Over 100 `Classic Scenarios' faced by designers when using the Design Compiler have been captured and discussed, and solutions provided. These scenarios are based on both personal experiences and actual user queries. A general understanding of the problem-solving techniques provided should help the reader debug similar and more complicated problems. In addition, several examples and dc_shell scripts (Design Compiler scripts) have also been provided. Logic Synthesis Using Synopsys (R), Second Edition is an updated and revised version of the very successful first edition. The second edition covers several new and emerging areas, in addition to improvements in the presentation and contents of all chapters from the first edition. With the rapid shrinking of process geometries it is becoming increasingly important that `physical' phenomena like clusters and wire loads be considered during the synthesis phase. The increasing demand for FPGAs has warranted a greater focus on FPGA synthesis tools and methodology. Finally, behavioral synthesis, the move to designing at a higher level of abstraction than RTL, is fast becoming a reality. These factors have resulted in the inclusion of separate chapters in the second edition to cover Links to Layout, FPGA Synthesis and Behavioral Synthesis, respectively. Logic Synthesis Using Synopsys (R), Second Edition has been written with the CAD engineer in mind. A clear understanding of the synthesis tool's concepts, its capabilities and the related CAD issues will help the CAD engineer formulate an effective synthesis-based ASIC design methodology. The intent is also to assist design teams to better incorporate and effectively integrate synthesis with their existing in-house design methodology and CAD tools.

Software Synthesis from Dataflow Graphs (Paperback, Softcover reprint of the original 1st ed. 1996)
Shuvra S. Bhattacharyya, Praveen K. Murthy, Edward A. Lee
R2,628 Discovery Miles 26 280 Ships in 18 - 22 working days

Software Synthesis from Dataflow Graphs addresses the problem of generating efficient software implementations from applications specified as synchronous dataflow graphs for programmable digital signal processors (DSPs) used in embedded real-time systems. The advent of high-speed graphics workstations has made feasible the use of graphical block diagram programming environments by designers of signal processing systems. A particular subset of dataflow, called Synchronous Dataflow (SDF), has proven efficient for representing a wide class of unirate and multirate signal processing algorithms, and has been used as the basis for numerous DSP block diagram-based programming environments such as the Signal Processing Workstation from Cadence Design Systems, Inc., COSSAP from Synopsys (R) (both commercial tools), and the Ptolemy environment from the University of California at Berkeley. A key property of the SDF model is that static schedules can be determined at compile time. This removes the overhead of dynamic scheduling and is thus useful for real-time DSP programs where throughput requirements are often severe. Another constraint that programmable DSPs for embedded systems have is the limited amount of on-chip memory. Off-chip memory is not only expensive but is also slower and increases the power consumption of the system; hence, it is imperative that programs fit in the on-chip memory whenever possible. Software Synthesis from Dataflow Graphs reviews the state-of-the-art in constructing static, memory-optimal schedules for programs expressed as SDF graphs. Code size reduction is obtained by the careful organization of loops in the target code. Data buffering is optimized by constructing the loop hierarchy in provably optimal ways for many classes of SDF graphs. The central result is a uniprocessor scheduling framework that provably synthesizes the most compact looping structures, called single appearance schedules, for a certain class of SDF graphs. In addition, algorithms and heuristics are presented that generate single appearance schedules optimized for data buffering usage. Numerous practical examples and extensive experimental data are provided to illustrate the efficacy of these techniques.
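
A taste of the buffering trade-off discussed above can be given on a single SDF edge. The sketch below is an illustration of the underlying balance-equation idea, not code from the book; the two-actor chain A -> B and the function names are invented for the example. It computes the minimal repetitions vector, builds the flat single appearance schedule and notes its buffer requirement.

from math import gcd

def repetitions(produce, consume):
    """Minimal integer repetitions (q_A, q_B) for a two-actor SDF edge A -> B.

    The SDF balance equation requires produce * q_A == consume * q_B, so that
    one schedule period returns the edge buffer to its initial state.
    """
    g = gcd(produce, consume)
    return consume // g, produce // g   # (q_A, q_B)

def single_appearance_schedule(produce, consume):
    """Flat single appearance schedule '(q_A A)(q_B B)' and its buffer need."""
    q_a, q_b = repetitions(produce, consume)
    schedule = "({} A)({} B)".format(q_a, q_b)
    max_tokens = produce * q_a   # every token of A is buffered before B runs
    return schedule, max_tokens

# Example: A produces 3 tokens per firing, B consumes 2. Balance gives
# q = (2, 3); the looped schedule "(2 A)(3 B)" mentions each actor once
# (a single appearance schedule) but needs a 6-token buffer, whereas an
# interleaved schedule such as "A B A (2 B)" would get by with 4.
print(single_appearance_schedule(3, 2))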

Simulated Annealing for VLSI Design (Paperback, Softcover reprint of the original 1st ed. 1988)
D.F. Wong, H.W. Leong, H. W. Liu
R2,633 Discovery Miles 26 330 Ships in 18 - 22 working days

This monograph represents a summary of our work in the last two years in applying the method of simulated annealing to the solution of problems that arise in the physical design of VLSI circuits. Our study is experimental in nature, in that we are concerned with issues such as solution representations, neighborhood structures, cost functions, approximation schemes, and so on, in order to obtain good design results in a reasonable amount of computation time. We hope that our experiences with the techniques we employed, some of which indeed bear certain similarities for different problems, could be useful as hints and guides for other researchers in applying the method to the solution of other problems. Work reported in this monograph was partially supported by the National Science Foundation under grant MIP 87-03273, by the Semiconductor Research Corporation under contract 87-DP-109, by a grant from the General Electric Company, and by a grant from the Sandia Laboratories.

Meta-Modeling - Performance and Information Modeling (Paperback, Softcover reprint of the original 1st ed. 1996)
Jean-Michel Berge, Oz Levia, Jacques Rouillard
R2,634 Discovery Miles 26 340 Ships in 18 - 22 working days

Models in system design follow the general tendency in electronics in terms of size, complexity and difficulty of maintenance. While a model should be a manageable representation of a system, this increasing complexity sometimes forces current CAD-tool designers and model writers to apply modeling techniques to the model itself. Model writers are interested in instrumenting their model, so as to extract critical information before the model is complete. CAD-tool designers use internal representations of the design at various stages. The complexity has also led CAD-tool developers to develop formal tools, theories and methods to improve relevance, completeness and consistency of those internal representations. Information modeling involves the representation of objects, their properties and relationships.

Performance Modeling: When it comes to design choices and trade-offs, performance is generally the final key. However, performance estimations have to be extracted at a very early stage in the system design. Performance modeling concerns the set of tools and techniques that allow or help the designer to capture metrics relating to future architectures. Performance modeling encompasses the whole system, including software modeling. It has a strong impact on all levels of design choices, from hardware/software partitioning to the final layout.

Information Modeling: Specification and formalism have in the past traditionally played little part in the design and development of EDA systems, their support environments, languages and processes. Instead, EDA system developers and EDA system users have seemed to be content to operate within environments that are often extremely complex and may be poorly tested and understood. This situation has now begun to change with the increasing use of techniques drawn from the domains of formal specification and database design. This section of the volume addresses aspects of the techniques being used. In particular, it considers a specific formalism, called information modeling, which has gained increasing acceptance recently and is now a key part of many of the proposals in the EDA Standards Roadmap, which promises to be of significance to the EDA industry. In addition, the section looks at an example of a design system from the point of view of its underlying understanding of the design process rather than through a consideration of particular CAD algorithms.

Meta-Modeling: Performance and Information Modeling contains papers describing the very latest techniques used in meta-modeling. It will be a valuable text for researchers, practitioners and students involved in Electronic Design Automation.

Principles of VLSI System Planning - A Framework for Conceptual Design (Paperback, Softcover reprint of the original 1st ed. 1990)
Allen M. Dewey, Stephen W. Director
R3,993 Discovery Miles 39 930 Ships in 18 - 22 working days

This book describes a new type of computer aided VLSI design tool, called a VLSI System Planner, that is meant to aid designers during the early, or conceptual, stage of design. During this stage of design, the objective is to define a general design plan, or approach, that is likely to result in an efficient implementation satisfying the initial specifications, or to determine that the initial specifications are not realizable. A design plan is a collection of high-level design decisions. As an example, the conceptual design of digital filters involves choosing the type of algorithm to implement (e.g., finite impulse response or infinite impulse response), the type of polynomial approximation (e.g., equiripple or Chebyshev), the fabrication technology (e.g., CMOS or BiCMOS), and so on. Once a particular design plan is chosen, the detailed design phase can begin. It is during this phase that various synthesis, simulation, layout, and test activities occur to refine the conceptual design, gradually filling in more detail until the design is finally realized. The principal advantage of VLSI System Planning is that the increasingly expensive resources of the detailed design process are more efficiently managed. Costly redesigns are minimized because the detailed design process is guided by a more credible, consistent, and correct design plan.
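
To make the idea of a design plan concrete, here is a minimal sketch (not from the book; the dictionary encoding and function name are hypothetical) that enumerates the conceptual design space spanned by the digital-filter example above, one choice per high-level decision axis.

from itertools import product

# Hypothetical encoding of the digital-filter example: a conceptual design
# plan is one choice per high-level decision axis.
DECISIONS = {
    "algorithm":     ["FIR", "IIR"],
    "approximation": ["equiripple", "Chebyshev"],
    "technology":    ["CMOS", "BiCMOS"],
}

def enumerate_plans(decisions):
    """Yield every candidate design plan in the conceptual design space."""
    axes = list(decisions)
    for combo in product(*(decisions[axis] for axis in axes)):
        yield dict(zip(axes, combo))

# A planning tool would prune this space against the initial specification
# (or report that no plan satisfies it) before detailed design begins.
for plan in enumerate_plans(DECISIONS):
    print(plan)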

Computer-aided Tolerancing - Proceedings of the 4th CIRP Design Seminar, The University of Tokyo, Tokyo, Japan, April 5-6, 1995 (Paperback, Softcover reprint of the original 1st ed. 1996)
Fumihiko Kimura
R5,159 Discovery Miles 51 590 Ships in 18 - 22 working days

Theory and practice of tolerances are very important for designing and manufacturing engineering artifacts on a rational basis. Tolerance specifies a degree of "discrepancy" between an idealized object and its physical realization. Such discrepancy inevitably comes into our product realization processes because of practical cost considerations or our inability to fully control manufacturing processes. The major product and production characteristics affected by tolerances are product quality and cost. Achieving high-precision machines requires tight tolerance specifications, but this will normally increase product cost. In order to optimally compromise between the conflicting requirements of quality and cost, it is essential to take into account the total product life cycle throughout product planning, design, manufacturing, maintenance and recycling. For example, in order to construct durable products under severe working conditions, low sensitivity of product functionality with respect to tolerances is required. In the future, re-use of components or parts will become important, and tolerance synthesis with respect to this aspect will be an interesting future research topic.

Testing and Reliable Design of CMOS Circuits (Paperback, Softcover reprint of the original 1st ed. 1990)
Niraj K. Jha, Sandip Kundu
R3,999 Discovery Miles 39 990 Ships in 18 - 22 working days

In the last few years CMOS technology has become increasingly dominant for realizing Very Large Scale Integrated (VLSI) circuits. The popularity of this technology is due to its high density and low power requirement. The ability to realize very complex circuits on a single chip has brought about a revolution in the world of electronics and computers. However, the rapid advancements in this area pose many new problems in the area of testing. Testing has become a very time-consuming process. In order to ease the burden of testing, many schemes for designing the circuit for improved testability have been presented. These design for testability techniques have begun to catch the attention of chip manufacturers. The trend is towards placing increased emphasis on these techniques. Another byproduct of the increase in the complexity of chips is their higher susceptibility to faults. In order to take care of this problem, we need to build fault-tolerant systems. The area of fault-tolerant computing has steadily gained in importance. Today many universities offer courses in the areas of digital system testing and fault-tolerant computing. Due to the importance of CMOS technology, a significant portion of these courses may be devoted to CMOS testing. This book has been written as a reference text for such courses offered at the senior or graduate level. Familiarity with logic design and switching theory is assumed. The book should also prove to be useful to professionals working in the semiconductor industry.

High-Level System Modeling - Specification and Design Methodologies (Paperback, Softcover reprint of the original 1st ed. 1996)
Ronald Waxman, Jean-Michel Berge, Oz Levia, Jacques Rouillard
R3,992 Discovery Miles 39 920 Ships in 18 - 22 working days

In system design, generation of high-level abstract models that can be closely associated with evolving lower-level models provides designers with the ability to incrementally 'test' an evolving design against a model of a specification. Such high-level models may deal with areas such as performance, reliability, availability, maintainability, and system safety. Abstract models also allow exploration of the hardware-versus-software design space in an incremental fashion as a fuller, detailed design unfolds, leaving behind the old practice of binding hardware and software too early in the design process. Such models may also allow the inclusion of non-functional aspects of design (e.g. space, power, heat) in a simulatable information model dealing with the system's operation. This book addresses Model Generation and Application specifically in the following domains:
-Specification modeling (linking object/data modeling, behavior modeling, and activity modeling),
-Operational specification modeling (modeling the way the system is supposed to operate, from a user's viewpoint),
-Linking non-functional parameters with specification models,
-Hybrid modeling (linking performance and functional elements),
-Application of high-level modeling to hardware/software approaches,
-Mathematical analysis techniques related to the modeling approaches,
-Reliability modeling,
-Applications of high-level modeling,
-Reducing high-level modeling to practice.
High-Level System Modeling: Specification and Design Methodologies describes the latest research and practice in the modeling of electronic systems and as such is an important update for all researchers, design engineers and technical managers working in design automation and circuit design.

Introduction to Analog VLSI Design Automation (Paperback, Softcover reprint of the original 1st ed. 1990)
Mohammed Ismail, Jose E. Franca
R2,629 Discovery Miles 26 290 Ships in 18 - 22 working days

Very large scale integration (VLSI) technologies are now maturing, with a current emphasis toward submicron structures and sophisticated applications combining digital as well as analog circuits on a single chip. Abundant examples are found in today's advanced systems for telecommunications, robotics, automotive electronics, image processing, intelligent sensors, etc. Exciting new applications are being unveiled in the field of neural computing, where the massive use of analog/digital VLSI technologies will have a significant impact. To match such a fast technological trend towards single-chip analog/digital VLSI systems, researchers worldwide have long realized the vital need for producing advanced computer aided tools for designing both digital and analog circuits and systems for silicon integration. Architecture and circuit compilation, device sizing and layout generation are but a few familiar tasks in the world of digital integrated circuit design which can be efficiently accomplished by mature computer aided tools. In contrast, the state of tools for designing and producing analog or even analog/digital integrated circuits is quite primitive and still lacking the industrial penetration and acceptance already achieved by digital counterparts. In fact, analog design is commonly perceived to be one of the most knowledge-intensive design tasks, and analog circuits are still designed, largely by hand, by experts intimately familiar with the nuances of the target application and integrated circuit fabrication process. The techniques needed to build good analog circuits seem to exist solely as expertise invested in individual designers.

Advances in Machine Vision (Paperback, Softcover reprint of the original 1st ed. 1989)
Jorge L. C. Sanz
R2,690 Discovery Miles 26 900 Ships in 18 - 22 working days

Machine Vision technology is becoming an indispensable part of the manufacturing industry. Biomedical and scientific applications of machine vision and imaging are becoming more and more sophisticated, and new applications continue to emerge. This book gives an overview of ongoing research in machine vision and presents the key issues of scientific and practical interest. A selected board of experts from the US, Japan and Europe provides an insight into some of the latest work done on machine vision systems and applications.

Object-Oriented Modeling (Paperback, Softcover reprint of the original 1st ed. 1996)
Jean-Michel Berge, Oz Levia, Jacques Rouillard
R5,113 Discovery Miles 51 130 Ships in 18 - 22 working days

Object-oriented techniques and languages have been proven to significantly increase engineering efficiency in software development. Many benefits are expected from their introduction into electronic modeling. Among them are better support for model reusability and flexibility, more efficient system modeling, and more possibilities in design space exploration and prototyping. Object-Oriented Modeling explores the latest techniques in object-oriented methods, formalisms and hardware description language extensions. The seven chapters comprising this book provide an overview of the latest object-oriented techniques for designing systems and hardware. Many examples are given in C++, VHDL and real-time programming languages. Object-Oriented Modeling describes further the use of object-oriented techniques in applications such as embedded systems, telecommunications and real-time systems, using the very latest techniques in object-oriented modeling. It is an essential guide to researchers, practitioners and students involved in software, hardware and system design.

Hierarchical Modeling for VLSI Circuit Testing (Paperback, Softcover reprint of the original 1st ed. 1990)
Debashis Bhattacharya, John P. Hayes
R2,621 Discovery Miles 26 210 Ships in 18 - 22 working days

Test generation is one of the most difficult tasks facing the designer of complex VLSI-based digital systems. Much of this difficulty is attributable to the almost universal use in testing of low, gate-level circuit and fault models that predate integrated circuit technology. It has long been recognized that the testing problem can be alleviated by the use of higher-level methods in which multigate modules or cells are the primitive components in test generation; however, the development of such methods has proceeded very slowly. To be acceptable, high-level approaches should be applicable to most types of digital circuits, and should provide fault coverage comparable to that of traditional, low-level methods. The fault coverage problem has, perhaps, been the most intractable, due to continued reliance in the testing industry on the single stuck-line (SSL) fault model, which is tightly bound to the gate level of abstraction. This monograph presents a novel approach to solving the foregoing problem. It is based on the systematic use of multibit vectors rather than single bits to represent logic signals, including fault signals. A circuit is viewed as a collection of high-level components such as adders, multiplexers, and registers, interconnected by n-bit buses. To match this high-level circuit model, we introduce a high-level bus fault that, in effect, replaces a large number of SSL faults and allows them to be tested in parallel. However, by reducing the bus size from n to one, we can obtain the traditional gate-level circuit and fault models.
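
The word-level flavor of this approach can be illustrated with ordinary integers standing in for n-bit buses. The sketch below is a simplified illustration, not the authors' method, and the masks and function names are hypothetical: a composite bus fault is described by a pair of force-to-0/force-to-1 masks, and one word-wide comparison shows which bit positions of a test pattern expose it.

def apply_bus_fault(value, sa0_mask, sa1_mask):
    """Faulty value of an n-bit bus under a composite stuck-at fault.

    The bus fault is described by two masks: bits forced to 0 (sa0_mask) and
    bits forced to 1 (sa1_mask). One bitwise expression covers the
    corresponding single-stuck-line faults on every bit of the bus at once.
    """
    return (value & ~sa0_mask) | sa1_mask

def exposed_bits(good_value, sa0_mask, sa1_mask, width=8):
    """Bit positions at which the applied pattern exposes the bus fault."""
    faulty = apply_bus_fault(good_value, sa0_mask, sa1_mask)
    return (good_value ^ faulty) & ((1 << width) - 1)

# Example: an 8-bit bus carrying 0b10101100, with bit 7 stuck-at-0 and
# bit 1 stuck-at-1. The pattern exposes both faults in a single evaluation.
good = 0b10101100
diff = exposed_bits(good, sa0_mask=0b10000000, sa1_mask=0b00000010)
print("{:08b}".format(diff))   # -> 10000010: bits 7 and 1 differ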

Hardware Component Modeling (Paperback, Softcover reprint of the original 1st ed. 1996)
Jean-Michel Berge, Oz Levia, Jacques Rouillard
R2,618 Discovery Miles 26 180 Ships in 18 - 22 working days

Hardware Component Modeling highlights the current status of the modeling of electronic components. It includes contributions from many of the leading researchers and practitioners in the field. The contents focus on four important topics. Standards: Three chapters describe current developments in employing standards for the use of component libraries. A major part of these chapters provides an excellent introduction to VITAL (an IEEE standard), its application and some of the issues in using and implementing it. There are, however, other standards with a role to play and these are also covered. Data Types: One chapter describes the latest techniques for using data types in modeling and simulation. Model Generation: One chapter describes a model generator for reusable component models and another describes a generator which takes actual physical data as its source and generates a functional model. Quality Assurance: Two chapters are devoted to improving the quality of models. One introduces a method for quantifying aspects of model quality and the other introduces quality concepts which can lead to an increase in model value through reuse and robustness. Hardware Component Modeling is a valuable reference for researchers and practitioners involved in the process of modeling electronic components.

Synthesis of Power Distribution to Manage Signal Integrity in Mixed-Signal ICs (Paperback, Softcover reprint of the original 1st ed. 1996)
Balsha R. Stanisic, Rob A. Rutenbar, L. Richard Carley
R2,637 Discovery Miles 26 370 Ships in 18 - 22 working days

In the early days of VLSI, the design of the power distribution for an integrated circuit was rather simple. Power distribution -- the design of the geometric topology for the network of wires that connect the various power supplies, the widths of the individual segments for each of these wires, the number and location of the power I/O pins around the periphery of the chip -- was simple because the chips were simpler. Few available wiring layers forced floorplans that allowed simple, planar (non-overlapping) power networks. Lower speeds and circuit density made the choice of the wire widths easier: we made them just fat enough to avoid resistive voltage drops due to switching currents in the supply network. And we just didn't need enormous numbers of power and ground pins on the package for the chips to work. It's not so simple any more. Increased integration has forced us to focus on reliability concerns such as metal electromigration, which affects wire sizing decisions in the power network. Extra metal layers have allowed more flexibility in the topological layout of the power networks.
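
The old "just fat enough" sizing rule mentioned above can be written down directly. The following back-of-the-envelope sketch uses invented numbers and is not a method from the book: it takes the larger of the width dictated by an IR-drop budget and the width dictated by a current-density (electromigration-style) limit.

def min_power_wire_width(current_ma, length_um, sheet_res_ohm_sq,
                         max_drop_mv, max_density_ma_per_um):
    """Smallest wire width (um) meeting both an IR-drop budget and a simple
    electromigration-style current-density limit.

    IR drop: a wire of length L and width W has R = Rs * L / W (ohms), so
    I * Rs * L / W <= Vmax  gives  W >= I * Rs * L / Vmax.
    Current density: W >= I / Jmax.
    """
    w_ir = current_ma * sheet_res_ohm_sq * length_um / max_drop_mv
    w_em = current_ma / max_density_ma_per_um
    return max(w_ir, w_em)

# Illustrative numbers only: 40 mA through a 1000 um run of metal with
# 0.07 ohm/sq sheet resistance, a 50 mV drop budget and a 1 mA/um limit.
print(min_power_wire_width(40, 1000, 0.07, 50, 1.0))   # ~56 um: IR drop dominates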

Computer Methods for Analysis of Mixed-Mode Switching Circuits (Paperback, Softcover reprint of hardcover 1st ed. 2004)
Fei Yuan, Ajoy Opal
R2,678 Discovery Miles 26 780 Ships in 18 - 22 working days

Computer Methods for Analysis of Mixed-Mode Switching Circuits provides an in-depth treatment of the principles and implementation details of computer methods and numerical algorithms for analysis of mixed-mode switching circuits. Major topics include:
-Computer-oriented formulation of mixed-mode switching circuits,
-Network functions of linear and nonlinear time-varying systems,
-Numerical Laplace inversion based integration algorithms and inconsistent initial conditions,
-Time domain analysis of periodically switched linear and nonlinear circuits including response, sensitivity, noise, clock jitter, and statistical quantities (a minimal response sketch follows this list),
-Time domain analysis of circuits with internally controlled switches and over-sampled sigma-delta modulators,
-Tellegen's theorem, frequency reversal theorem, and transfer function theorem of periodically switched linear circuits and their applications,
-Frequency domain analysis of periodically switched linear and nonlinear circuits including response, sensitivity, group delay, noise, and statistical quantities.
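
As a small taste of the time-domain analysis topic flagged in the list, here is a minimal sketch (my own illustration, not an algorithm from the book; the component values and function name are invented): a fixed-step backward-Euler simulation of an RC low-pass driven through a periodic ideal switch, with the capacitor simply holding its voltage while the switch is open.

def switched_rc_response(vin, r, c, f_sw, duty, t_end, h):
    """Backward-Euler simulation of an RC low-pass fed through a periodic
    ideal switch that is closed for `duty` of each switching period.

    Closed phase:  C dv/dt = (vin - v) / r, discretized implicitly as
                   v_next = (v + h*vin/(r*c)) / (1 + h/(r*c)).
    Open phase:    no conduction path, so the capacitor voltage holds.
    """
    period = 1.0 / f_sw
    a = h / (r * c)
    v, t, samples = 0.0, 0.0, []
    while t < t_end:
        closed = (t % period) < duty * period
        if closed:
            v = (v + a * vin) / (1.0 + a)
        samples.append((t, v))
        t += h
    return samples

# Illustrative values: 1 V source, R = 1 kOhm, C = 100 nF, a 10 kHz switch
# at 50% duty cycle, simulated for 2 ms with a 1 us step.
trace = switched_rc_response(1.0, 1e3, 100e-9, 10e3, 0.5, 2e-3, 1e-6)
print("final capacitor voltage: {:.3f} V".format(trace[-1][1]))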

Switching Networks: Recent Advances (Paperback, Softcover reprint of the original 1st ed. 2001)
Dingzhu Du, Hung Q. Ngo
R2,677 Discovery Miles 26 770 Ships in 18 - 22 working days

The switching network is an important, classic research area in telecommunication and computer networks. Its importance stems from both theory and practice. In fact, some open problems, such as the Benes conjecture on shuffle-exchange networks and the Chung-Ross conjecture on multirate rearrangeability, still attract many researchers, and further development in optical networks requires advanced technology in optical switching networks. In 1997, we had a workshop on switching networks held at the NSF Science and Technology Center in Discrete Mathematics and Theoretical Computer Science (DIMACS), at Princeton University. This workshop was very successful. Many participants wished to have a similar activity every two or three years. This book is a result of such a wish. We are putting together some important developments in this area during the last several years, including articles on fault tolerance, rearrangeability, nonblocking, optical networks, random permutation generation, and layout complexity. Some of those articles are research papers and some are surveys. All articles were reviewed. We would like to mention two special problems studied in those articles.

Curves and Surfaces for Computer Graphics (Paperback, 2006)
David Salomon
R2,281 Discovery Miles 22 810 Ships in 18 - 22 working days

Computer graphics is important in many areas including engineering design, architecture, education, and computer art and animation. This book examines a wide array of current methods used in creating real-looking objects in the computer, one of the main aims of computer graphics.

Key features:

* Good foundational mathematical introduction to curves and surfaces; no advanced math required

* Topics organized by different interpolation/approximation techniques, each technique providing useful information about curves and surfaces

* Exposition motivated by numerous examples and exercises sprinkled throughout, aiding the reader

* Includes a gallery of color images, Mathematica code listings, and sections on curves & surfaces by refinement and on sweep surfaces

* Web site maintained and updated by the author, providing readers with errata and auxiliary material

This engaging text is geared to a broad and general readership of computer science/architecture engineers using computer graphics to design objects, programmers for computer gamemakers, applied mathematicians, and students majoring in computer graphics and its applications. It may be used in a classroom setting or as a general reference.
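
As a flavor of the interpolation and approximation material the feature list refers to, here is a minimal sketch (mine, not taken from the book): de Casteljau's algorithm, which evaluates a Bezier curve by repeatedly interpolating its control polygon.

def de_casteljau(control_points, t):
    """Evaluate a Bezier curve at parameter t (0 <= t <= 1) by repeatedly
    interpolating the control polygon (de Casteljau's algorithm)."""
    pts = [tuple(p) for p in control_points]
    while len(pts) > 1:
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
    return pts[0]

# Example: a planar cubic Bezier arc sampled at its midpoint. The curve
# starts at (0, 0), ends at (3, 0) and is pulled upward by the two inner
# control points.
cubic = [(0.0, 0.0), (1.0, 2.0), (2.0, 2.0), (3.0, 0.0)]
print(de_casteljau(cubic, 0.5))   # -> (1.5, 1.5)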

Digital Architecture Beyond Computers - Fragments of a Cultural History of Computational Design (Hardcover)
Roberto Bottazzi
R3,184 Discovery Miles 31 840 Ships in 10 - 15 working days

Digital Architecture Beyond Computers explores the deep history of digital architecture, tracing design concepts as far back as the Renaissance and connecting them with the latest software used by designers today. It develops a critical account of how the tools and techniques of digital design have emerged, and allows designers to deepen their understanding of the digital tools they use every day. What aesthetic, spatial, and philosophical concepts converge within the digital tools architects employ? What is their history? And what kinds of techniques and designs have they given rise to? This book explores the answers to these questions, showing how digital architecture brings together complex ideas and trajectories which span across several domains and have evolved over many centuries. It sets out to unpack these ideas, trace their origin and permeation into architecture, and re-examine their use in contemporary software. Chapters are arranged around the histories of nine 'fragments' - each a fundamental concept embedded in popular CAD applications: database, layers and fields, parametrics, pixel, programme, randomness, scanning, topology, and voxel/maxel - with each theme examined through a series of historical and contemporary case studies. The book thus connects the digital design process with architectural history and theory, allowing designers and theorists alike to develop more analytical and critical tools with which to conceptualise digital design and its software.

Formal Methods for Embedded Distributed Systems - How to master the complexity (Paperback, Softcover reprint of hardcover 1st ed. 2004)
Fabrice Kordon, Michel Lemoine
R2,653 Discovery Miles 26 530 Ships in 18 - 22 working days

The development of any Software (Industrial) Intensive System, e.g. critical embedded software, requires both different notations and a strong development process. Different notations are mandatory because different aspects of the Software System have to be tackled. A strong development process is mandatory as well, because without a strong organization we cannot guarantee that the system will meet its requirements. Unfortunately, much more is needed! The different notations that can be used must all possess at least one property: formality. The development process must also have important properties: an exhaustive coverage of the development phases, and a set of well integrated support tools. In Computer Science it is now widely accepted that only formal notations can guarantee a perfectly defined meaning. This becomes a more and more important issue since software systems tend to be distributed in large systems (for instance in safe public transportation systems), and in small ones (for instance numerous processors in luxury cars). Distribution increases the complexity of embedded software while safety criteria become harder to meet. On the other hand, during the past decade Software Engineering techniques have been improved a lot, and are now currently used to conduct systematic and rigorous development of large software systems. UML has become the de facto standard notation for documenting Software Engineering projects. UML is supported by many CASE tools that offer graphical means for the UML notation.

New Advances in Intelligent Decision Technologies - Results of the First KES International Symposium IDT'09 (Paperback, Softcover reprint of hardcover 1st ed. 2009)
Gloria Phillips-Wren
R5,245 Discovery Miles 52 450 Ships in 18 - 22 working days

IDT (Intelligent Decision Technologies) seeks an interchange of research on intelligent systems and intelligent technologies which enhance or improve decision making in industry, government and academia. The focus is interdisciplinary in nature, and includes research on all aspects of intelligent decision technologies, from fundamental development to the applied system.

It constitutes a great honor and pleasure for us to publish the works and new research results of scholars from the First KES International Symposium on Intelligent Decision Technologies (KES IDT'09), hosted and organized by University of Hyogo in conjunction with KES International (Himeji, Japan, April, 2009). The symposium was concerned with theory, design, development, implementation, testing and evaluation of intelligent decision systems. Its topics included intelligent agents, fuzzy logic, multi-agent systems, artificial neural networks, genetic algorithms, expert systems, intelligent decision making support systems, information retrieval systems, geographic information systems, and knowledge management systems. These technologies have the potential to support decision making in many areas of management, international business, finance, accounting, marketing, healthcare, military applications, production, networks, traffic management, crisis response, and human interfaces.

Classification and Clustering for Knowledge Discovery (Paperback, Softcover reprint of hardcover 1st ed. 2005)
Saman K. Halgamuge, Lipo Wang
R4,031 Discovery Miles 40 310 Ships in 18 - 22 working days

Knowledge Discovery today is a significant study and research area. In finding answers to many research questions in this area, the ultimate hope is that knowledge can be extracted from various forms of data around us. This book covers recent advances in unsupervised and supervised data analysis methods in Computational Intelligence for knowledge discovery. In its first part the book provides a collection of recent research on distributed clustering, self-organizing maps and their recent extensions. If labeled data or data with known associations are available, we may be able to use supervised data analysis methods, such as classifying neural networks, fuzzy rule-based classifiers, and decision trees. Therefore this book presents a collection of important methods of supervised data analysis. "Classification and Clustering for Knowledge Discovery" also includes a variety of applications of knowledge discovery in health, safety, commerce, mechatronics, sensor networks, and telecommunications.

Visualization of Digital Terrain and Landscape Data - A Manual (Paperback, Softcover reprint of hardcover 1st ed. 2007)
Rudiger Mach, Peter Petschek
R2,678 Discovery Miles 26 780 Ships in 18 - 22 working days

This book approaches the realisation of digital terrain and landscape data through clear and practical examples. From data provision and the creation of revealing analyses to realistic depictions for presentation purposes, the reader is led through the world of digital 3-D graphics. The authors' deep knowledge of the scientific fundamentals and many years of experience in 3-D visualization enable them to lead the reader through a complex subject and shed light on previously murky virtual landscapes.

Rough Set Theory: A True Landmark in Data Analysis (Paperback, Softcover reprint of hardcover 1st ed. 2009)
Ajith Abraham, Rafael Falcon, Rafael Bello
R4,024 Discovery Miles 40 240 Ships in 18 - 22 working days

Over the years, rough set theory has earned a well-deserved reputation as a sound methodology for dealing with imperfect knowledge in a simple though mathematically sound way. This edited volume aims to continue stressing the benefits of applying rough sets in many real-life situations, while still keeping an eye on topological aspects of the theory as well as strengthening its linkage with other soft computing paradigms. The volume comprises 11 chapters and is organized into three parts. Part 1 deals with theoretical contributions while Parts 2 and 3 focus on several real-world data mining applications. Chapters authored by pioneers were selected on the basis of fundamental ideas/concepts rather than the thoroughness of techniques deployed. Academics, scientists and engineers working in the rough set, computational intelligence, soft computing and data mining research areas will find the comprehensive coverage of this book invaluable.

Modeling Uncertainty with Fuzzy Logic - With Recent Theory and Applications (Paperback, Softcover reprint of hardcover 1st ed. 2009)
Asli Celikyilmaz, I. Burhan Turksen
R4,053 Discovery Miles 40 530 Ships in 18 - 22 working days

The world we live in is pervaded with uncertainty and imprecision. Is it likely to rain this afternoon? Should I take an umbrella with me? Will I be able to find parking near the campus? Should I go by bus? Such simple questions are a common occurrence in our daily lives. Less simple examples: What is the probability that the price of oil will rise sharply in the near future? Should I buy Chevron stock? What are the chances that a bailout of GM, Ford and Chrysler will not succeed? What will be the consequences? Note that the examples in question involve both uncertainty and imprecision. In the real world, this is the norm rather than exception. There is a deep-seated tradition in science of employing probability theory, and only probability theory, to deal with uncertainty and imprecision. The monopoly of probability theory came to an end when fuzzy logic made its debut. However, this is by no means a widely accepted view. The belief persists, especially within the probability community, that probability theory is all that is needed to deal with uncertainty. To quote a prominent Bayesian, Professor Dennis Lindley, "The only satisfactory description of uncertainty is probability."

Design of Dependable Computing Systems (Paperback, Softcover reprint of the original 1st ed. 2002)
J. C. Geffroy, G. Motet
R1,516 Discovery Miles 15 160 Ships in 18 - 22 working days

This book analyzes the causes of failures in computing systems, their consequences, as well as the existing solutions to manage them. The domain is tackled in a progressive and educational manner with two objectives: 1. The mastering of the basics of the dependability domain at system level, that is to say independently of the technology used (hardware or software) and of the domain of application. 2. The understanding of the fundamental techniques available to prevent, to remove, to tolerate, and to forecast faults in hardware and software technologies. The first objective leads to the presentation of the general problem, the fault models and degradation mechanisms which are at the origin of the failures, and finally the methods and techniques which permit the faults to be prevented, removed or tolerated. This study concerns logical systems in general, independently of the hardware and software technologies put in place. This knowledge is indispensable for two reasons: a large part of a product's development is independent of the technological means (expression of requirements, specification and most of the design stage). Very often, the development team does not possess this basic knowledge; hence, the dependability requirements are considered only during the technological implementation. Such an approach is expensive and inefficient. Indeed, the removal of a preliminary design fault can be very difficult (if possible at all) if this fault is detected during the product's final testing.
