
Zygmunt Zawirski - His Life and Work - with Selected Writings on Time, Logic and the Methodology of Science (Hardcover)
Irena Szumilewicz-Lachman, Robert S. Cohen; Translated by Feliks Lachman
R2,678 Discovery Miles 26 780 Ships in 10 - 15 working days

Zygmunt Zawirski (1882-1948), an eminent and original Polish philosopher, belonged to the Lwow-Warsaw School (LWS), which left an indelible trace in logic, semiotics and the philosophy of science. The LWS was founded in 1895 by K. Twardowski, a disciple of Brentano, in the spirit of clarity, realism and analytic philosophy. The LWS was more than 25 years older than the Vienna Circle (VC). This belies, inter alia, the not infrequently repeated statement that the LWS was one of the many centres initiated by the VC. The achievements of the LWS in logic are well recognized, while those relating to the philosophy of science are almost unknown. It is in order to fill this gap that some fragments of Zawirski's papers are presented here, dealing mainly with causality, determinism, indeterminism and the philosophical implications of relativity and quantum mechanics. His magnum opus, "L'Evolution de la Notion du Temps" (Eugenio Rignano Prize, 1933), is devoted to time. Zawirski took into account all the issues which are at present widely discussed. The real value of these achievements can be understood better today than by his contemporaries. This text is suitable for all those interested in the philosophy of science, philosophy, and the history of ideas.

Digital Timing Macromodeling for VLSI Design Verification (Hardcover, 1995 ed.)
Jeong-Taek Kong, David V. Overhauser
R4,552 Discovery Miles 45 520 Ships in 10 - 15 working days

Digital Timing Macromodeling for VLSI Design Verification first of all provides an extensive history of the development of simulation techniques. It presents detailed discussion of the various techniques implemented in circuit, timing, fast-timing, switch-level timing, switch-level, and gate-level simulation. It also discusses mixed-mode simulation and interconnection analysis methods. The review in Chapter 2 gives an understanding of the advantages and disadvantages of the many techniques applied in modern digital macromodels. The book also presents a wide variety of techniques for performing nonlinear macromodeling of digital MOS subcircuits which address a large number of shortcomings in existing digital MOS macromodels. Specifically, the techniques address the device model detail, transistor coupling capacitance, effective channel length modulation, series transistor reduction, effective transconductance, input terminal dependence, gate parasitic capacitance, the body effect, the impact of parasitic RC-interconnects, and the effect of transmission gates. The techniques address major sources of errors in existing macromodeling techniques, which must be addressed if macromodeling is to be accepted in commercial CAD tools by chip designers. The techniques presented in Chapters 4-6 can be implemented in other macromodels, and are demonstrated using the macromodel presented in Chapter 3. The new techniques are validated over an extremely wide range of operating conditions: much wider than has been presented for previous macromodels, thus demonstrating the wide range of applicability of these techniques.

VHDL '92 - The New Features of the VHDL Hardware Description Language (Hardcover)
Jean-Michel Berge, Alain Fonkoua, Serge Maginot, Jacques Rouillard
R2,642 Discovery Miles 26 420 Ships in 10 - 15 working days

An open process of restandardization, conducted by the IEEE, has led to the definition of the new VHDL standard. The changes make VHDL safer, more portable, and more powerful; VHDL also becomes bigger and more complete. The canonical simulator of VHDL is enriched by new mechanisms, the predefined environment is more complete, and the syntax is more regular and flexible. Discrepancies and known bugs of VHDL'87 have been fixed. However, the new VHDL'92 is compatible with VHDL'87, with some minor exceptions. This book presents the new VHDL'92 for the VHDL designer. New features are explained and classified. Examples are provided, each new feature is given a rationale, and its impact on design methodology and performance is analyzed. Where appropriate, pitfalls and traps are explained. The VHDL designer should quickly be able to find a needed feature, evaluate the benefits it brings, and modify previous VHDL'87 code to make it more efficient, more portable, and more flexible. This text should be a useful update for all VHDL designers and managers involved in electronic design.

A Formal Approach to Hardware Design (Hardcover, 1994 ed.)
Jorgen Staunstrup
R4,527 Discovery Miles 45 270 Ships in 10 - 15 working days

A Formal Approach to Hardware Design discusses designing computations to be realised by application specific hardware. It introduces a formal design approach based on a high-level design language called Synchronized Transitions. The models created using Synchronized Transitions enable the designer to perform different kinds of analysis and verification based on descriptions in a single language. It is, for example, possible to use exactly the same design description both for mechanically supported verification and synthesis. Synchronized Transitions is supported by a collection of public domain CAD tools. These tools can be used with the book in presenting a course on the subject. A Formal Approach to Hardware Design illustrates the benefits to be gained from adopting such techniques, but it does so without assuming prior knowledge of formal design methods. The book is thus not only an excellent reference, it is also suitable for use by students and practitioners.

Circuit Synthesis with VHDL (Hardcover, 1994 ed.)
Roland Airiau, Jean-Michel Berge, Vincent Olive
R4,521 Discovery Miles 45 210 Ships in 10 - 15 working days

One of the main applications of VHDL is the synthesis of electronic circuits. Circuit Synthesis with VHDL is an introduction to the use of VHDL logic (RTL) synthesis tools in circuit design. The modeling styles proposed are independent of specific market tools and focus on constructs widely recognized as synthesizable by synthesis tools. A statement of the prerequisites for synthesis is followed by a short introduction to the VHDL concepts used in synthesis. Circuit Synthesis with VHDL presents two possible approaches to synthesis: the first starts with VHDL features and derives hardware counterparts; the second starts from a given hardware component and derives several description styles. The book also describes how to introduce the synthesis design cycle into existing design methodologies and the standard synthesis environment. Circuit Synthesis with VHDL concludes with a case study providing a realistic example of the design flow from behavioral description down to the synthesized level. Circuit Synthesis with VHDL is essential reading for all students, researchers, design engineers and managers working with VHDL in a synthesis environment.

Fundamentals and Standards in Hardware Description Languages (Hardcover, 1993 ed.)
Jean Mermet
R8,646 Discovery Miles 86 460 Ships in 10 - 15 working days

The second half of this century will remain the era of the proliferation of electronic computers. They did exist before, but they were mechanical. During the next century they may undergo other mutations and become optical, molecular or even biological. Actually, all these aspects are only fancy dresses put on mathematical machines. This was always recognized to be true in the domain of software, where "machine" or "high level" languages are more or less rigorous, but immaterial, variations of the universally accepted mathematical language aimed at specifying elementary operations, functions, algorithms and processes. But even a mathematical machine needs a physical support, and this is what hardware is all about. The invention of hardware description languages (HDLs) in the early 60's was an attempt to stay longer at an abstract level in the design process and to push the stage of physical implementation up to the moment when no more technology-independent decisions can be taken. It was also an answer to the continuous, exponential growth of complexity of the systems to be designed. This problem is common to hardware and software and may explain why the syntax of hardware description languages has followed, with a reasonable delay of ten years, the evolution of the programming languages: at the end of the 60's they were "Algol-like", a decade later "Pascal-like", and now they are "C- or Ada-like". They have also integrated the new concepts of advanced software specification languages.

Application-Driven Architecture Synthesis (Hardcover, 1993 ed.)
Francky Catthoor, Lars-Gunnar Svensson
R4,396 Discovery Miles 43 960 Ships in 10 - 15 working days

Application-Driven Architecture Synthesis describes the state of the art of architectural synthesis for complex real-time processing. In order to deal with the stringent timing requirements and the intricacies of complex real-time signal and data processing, target architecture styles and target application domains have been adopted to make the synthesis approach feasible. These approaches are also heavily application-driven, which is illustrated by many realistic demonstrations, used as examples in the book. The focus is on domains where application-specific solutions are attractive, such as significant parts of audio, telecom, instrumentation, speech, robotics, medical and automotive processing, image and video processing, TV, multi-media, radar, sonar. Application-Driven Architecture Synthesis is of interest to both academics and senior design engineers and CAD managers in industry. It provides an excellent overview of what capabilities to expect from future practical design tools, and includes an extensive bibliography.

Field-Programmable Gate Arrays (Hardcover, 1992 ed.)
Stephen D. Brown, Robert J. Francis, Jonathan Rose, Zvonko G. Vranesic
R5,753 Discovery Miles 57 530 Ships in 10 - 15 working days

Field-Programmable Gate Arrays (FPGAs) have emerged as an attractive means of implementing logic circuits, providing instant manufacturing turnaround and negligible prototype costs. They hold the promise of replacing much of the VLSI market now held by mask-programmed gate arrays. FPGAs offer an affordable solution for customized VLSI, over a wide variety of applications, and have also opened up new possibilities in designing reconfigurable digital systems. Field-Programmable Gate Arrays discusses the most important aspects of FPGAs in a textbook manner. It provides the reader with a focused view of the key issues, using a consistent notation and style of presentation. It provides detailed descriptions of commercially available FPGAs and an in-depth treatment of the FPGA architecture and CAD issues that are the subjects of current research. The material presented is of interest to a variety of readers, including those who are not familiar with FPGA technology, but wish to be introduced to it, as well as those who already have an understanding of FPGAs, but who are interested in learning about the research directions that are of current interest.

Risk and Society: The Interaction of Science, Technology and Public Policy (Hardcover, 1992 ed.)
M. Waterstone
R4,496 Discovery Miles 44 960 Ships in 10 - 15 working days

Life in the last quarter of the twentieth century presents a baffling array of complex issues. The benefits of technology are arrayed against the risks and hazards of those same technological marvels (frequently, though not always, arising as side effects or by-products). This confrontation poses very difficult choices for individuals as well as for those charged with making public policy. Some of the most challenging of these issues result because of the ability of technological innovation and deployment to outpace the capacity of institutions to assess and evaluate implications. In many areas, the rate of technological advance has now far outstripped the capabilities of institutional monitoring and control. While there are many instances in which technological advance occurs without adverse consequences (and in fact, yields tremendous benefits), frequently the advent of a major innovation brings a wide array of unforeseen and (to some) undesirable effects. This problem is exacerbated as the interval between the initial development of a technology and its deployment is shortened, since the opportunity for cautious appraisal is decreased.

Assessing Fault Model and Test Quality (Hardcover, 1992 ed.)
Kenneth M. Butler, M.Ray Mercer
R2,975 Discovery Miles 29 750 Ships in 10 - 15 working days

For many years, the dominant fault model in automatic test pattern generation (ATPG) for digital integrated circuits has been the stuck-at fault model. The static nature of stuck-at fault testing when compared to the extremely dynamic nature of integrated circuit (IC) technology has caused many to question whether or not stuck-at fault based testing is still viable. Attempts at answering this question have not been wholly satisfying due to a lack of true quantification, statistical significance, and/or high computational expense. In this monograph we introduce a methodology to address the question in a manner which circumvents the drawbacks of previous approaches. The method is based on symbolic Boolean functional analyses using Ordered Binary Decision Diagrams (OBDDs). OBDDs have been conjectured to be an attractive representation form for Boolean functions, although cases exist for which their complexity is guaranteed to grow exponentially with input cardinality. Classes of Boolean functions which exploit the efficiencies inherent in OBDDs to a very great extent are examined in Chapter 7. Exact equations giving their OBDD sizes are derived, whereas until very recently only size bounds have been available. These size equations suggest that straightforward applications of OBDDs to design and test related problems may not prove as fruitful as was once thought.
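The stuck-at model described above is easy to demonstrate in a few lines. The sketch below (a toy three-input circuit with an invented internal net name, not an example from the book) finds every input vector that distinguishes the fault-free circuit from a copy with that net stuck at 0:

```python
from itertools import product

# Toy combinational circuit: y = (a AND b) OR c.
# The internal net "n1" models the AND-gate output so that a
# stuck-at fault can be injected on it (names are illustrative).
def circuit(a, b, c, stuck_n1=None):
    n1 = a & b
    if stuck_n1 is not None:      # inject a stuck-at fault on n1
        n1 = stuck_n1
    return n1 | c

# A test vector for "n1 stuck-at-0" is any input making the
# good and faulty circuits disagree at the output.
tests = [v for v in product([0, 1], repeat=3)
         if circuit(*v) != circuit(*v, stuck_n1=0)]
print(tests)   # [(1, 1, 0)] is the only vector exposing this fault
```

Enumerating all 2^n inputs like this is exactly what ATPG tools must avoid on real circuits, which is why symbolic representations such as OBDDs matter.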

Europe, America, and Technology: Philosophical Perspectives (Hardcover, 1991 ed.)
P. T. Durbin
R4,542 Discovery Miles 45 420 Ships in 10 - 15 working days

As Europe moves toward 1992 and full economic unity, and as Eastern Europe tries to find its way in the new economic order, the United States hesitates. Will the new European economic order be good for the U.S. or not? Such a question is exacerbated by world-wide changes in the technological order, most evident in Japan's new techno-economic power. As might be expected, philosophers have been slow to come to grips with such issues, and lack of interest is compounded by different philosophical styles in different parts of the world. What this volume addresses is more a matter of conflicting styles than a substantive confrontation with the real-world issues. But there is some attempt to be concrete. The symposium on Ivan Illich - with contributions from philosophers and social critics at the Pennsylvania State University, where Illich has taught for several years - may suggest the old cliche of Old World vs. New World. Illich's fulminations against technology are often dismissed by Americans as old-world-style prophecy, while Illich seems largely unknown in his native Europe. But Albert Borgmann, born in Germany though now settled in the U.S., shows that this old dichotomy is difficult to maintain in our technological world. Borgmann's focus is on urgent technological problems that have become almost painfully evident in both Europe and America.

Neural Models and Algorithms for Digital Testing (Hardcover, 1991 ed.)
S. T. Chakradhar, Vishwani Agrawal, M. Bushnell
R3,006 Discovery Miles 30 060 Ships in 10 - 15 working days

From the table of contents: 9. Quadratic 0-1 Programming (Energy Minimization; Notation and Terminology; Minimization Technique; An Example; Accelerated Energy Minimization: Transitive Closure, Additional Pairwise Relationships, Path Sensitization; Experimental Results; Summary). 10. Transitive Closure and Testing (Background; Transitive Closure Definition; Implication Graphs; A Test Generation Algorithm; Identifying Necessary Assignments: Implicit Implication and Justification, Transitive Closure Does More Than Implication and Justification, Implicit Sensitization of Dominators, Redundancy Identification; Summary). 11. Polynomial-Time Testability (Background: Fujiwara's Result, Contribution of the Present Work; Notation and Terminology; A Polynomial Time Algorithm: Primary Output Fault, Arbitrary Single Fault, Multiple Faults; Summary). 12. Special Cases of Hard Problems (Problem Statement; Logic Simulation; Logic Circuit Modeling: Model for a Boolean Gate, Circuit Modeling; ...).

Integrating Functional and Temporal Domains in Logic Design - The False Path Problem and Its Implications (Hardcover, 1991 ed.)
Patrick C. McGeer, Robert K. Brayton
R3,031 Discovery Miles 30 310 Ships in 10 - 15 working days

This book is an extension of one author's doctoral thesis on the false path problem. The work was begun with the idea of systematizing the various solutions to the false path problem that had been proposed in the literature, with a view to determining the computational expense of each versus the gain in accuracy. However, it became clear that some of the proposed approaches in the literature were wrong in that they underestimated the critical delay of some circuits under reasonable conditions. Further, some other approaches were vague and so of questionable accuracy. The focus of the research therefore shifted to establishing a theory (the viability theory) and algorithms which could be guaranteed correct, and then using this theory to justify (or not) existing approaches. Our quest was successful enough to justify presenting the full details in a book. After it was discovered that some existing approaches were wrong, it became apparent that the root of the difficulties lay in the attempts to balance computational efficiency and accuracy by separating the temporal and logical (or functional) behaviour of combinational circuits. This separation is the fruit of several unstated assumptions: first, that one can ignore the logical relationships of wires in a network when considering timing behaviour, and, second, that one can ignore timing considerations when attempting to discover the values of wires in a circuit.

Principles of VLSI System Planning - A Framework for Conceptual Design (Hardcover, 1990 ed.)
Allen M. Dewey, Stephen W. Director
R4,386 Discovery Miles 43 860 Ships in 10 - 15 working days

This book describes a new type of computer-aided VLSI design tool, called VLSI System Planning, that is meant to aid designers during the early, or conceptual, stage of design. During this stage of design, the objective is to define a general design plan, or approach, that is likely to result in an efficient implementation satisfying the initial specifications, or to determine that the initial specifications are not realizable. A design plan is a collection of high-level design decisions. As an example, the conceptual design of digital filters involves choosing the type of algorithm to implement (e.g., finite impulse response or infinite impulse response), the type of polynomial approximation (e.g., Equiripple or Chebyshev), the fabrication technology (e.g., CMOS or BiCMOS), and so on. Once a particular design plan is chosen, the detailed design phase can begin. It is during this phase that various synthesis, simulation, layout, and test activities occur to refine the conceptual design, gradually filling in more detail until the design is finally realized. The principal advantage of VLSI System Planning is that the increasingly expensive resources of the detailed design process are more efficiently managed. Costly redesigns are minimized because the detailed design process is guided by a more credible, consistent, and correct design plan.

Introduction to Analog VLSI Design Automation (Hardcover, 1990 ed.)
Mohammed Ismail, Jose E. Franca
R3,006 Discovery Miles 30 060 Ships in 10 - 15 working days

Very large scale integration (VLSI) technologies are now maturing with a current emphasis toward submicron structures and sophisticated applications combining digital as well as analog circuits on a single chip. Abundant examples are found in today's advanced systems for telecommunications, robotics, automotive electronics, image processing, intelligent sensors, etc. Exciting new applications are being unveiled in the field of neural computing, where the massive use of analog/digital VLSI technologies will have a significant impact. To match such a fast technological trend towards single-chip analog/digital VLSI systems, researchers worldwide have long realized the vital need of producing advanced computer-aided tools for designing both digital and analog circuits and systems for silicon integration. Architecture and circuit compilation, device sizing and layout generation are but a few familiar tasks in the world of digital integrated circuit design which can be efficiently accomplished by mature computer-aided tools. In contrast, the art of tools for designing and producing analog or even analog/digital integrated circuits is quite primitive and still lacking the industrial penetration and acceptance already achieved by digital counterparts. In fact, analog design is commonly perceived to be one of the most knowledge-intensive design tasks, and analog circuits are still designed, largely by hand, by experts intimately familiar with the nuances of the target application and integrated circuit fabrication process. The techniques needed to build good analog circuits seem to exist solely as expertise invested in individual designers.

Hierarchical Modeling for VLSI Circuit Testing (Hardcover, 1990 ed.)
Debashis Bhattacharya, John P. Hayes
R2,987 Discovery Miles 29 870 Ships in 10 - 15 working days

Test generation is one of the most difficult tasks facing the designer of complex VLSI-based digital systems. Much of this difficulty is attributable to the almost universal use in testing of low, gate-level circuit and fault models that predate integrated circuit technology. It has long been recognized that the testing problem can be alleviated by the use of higher-level methods in which multigate modules or cells are the primitive components in test generation; however, the development of such methods has proceeded very slowly. To be acceptable, high-level approaches should be applicable to most types of digital circuits, and should provide fault coverage comparable to that of traditional, low-level methods. The fault coverage problem has, perhaps, been the most intractable, due to continued reliance in the testing industry on the single stuck-line (SSL) fault model, which is tightly bound to the gate level of abstraction. This monograph presents a novel approach to solving the foregoing problem. It is based on the systematic use of multibit vectors rather than single bits to represent logic signals, including fault signals. A circuit is viewed as a collection of high-level components such as adders, multiplexers, and registers, interconnected by n-bit buses. To match this high-level circuit model, we introduce a high-level bus fault that, in effect, replaces a large number of SSL faults and allows them to be tested in parallel. However, by reducing the bus size from n to one, we can obtain the traditional gate-level circuit and fault models.
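The multibit-vector idea sketched in the blurb can be illustrated with ordinary machine words (the 8-bit bus width, signal values, and helper name below are assumptions made for this example, not the book's notation). XOR-ing the good and faulty bus values reports, in a single word-wide operation, which of the underlying single-line faults a given input exposes:

```python
# Python ints serve as bit-vectors for an 8-bit bus.
BUS_WIDTH = 8

def bus_xor(a, b):
    # One XOR gate per bus line, evaluated word-wide in one operation.
    return a ^ b

good_a, good_b = 0b10110101, 0b01101100
good = bus_xor(good_a, good_b)

# Bus fault: input bus "a" stuck at all-zero, covering 8 SSL faults at once.
faulty = bus_xor(0, good_b)

# Each 1-bit marks a bus line whose stuck-at-0 fault this input detects.
exposed = good ^ faulty
print(f"{exposed:0{BUS_WIDTH}b}")   # prints 10110101
```

Every line of `a` carrying a 1 disagrees with the stuck-at-0 value, so the exposed-fault mask equals `good_a` itself; shrinking `BUS_WIDTH` to 1 recovers the ordinary single-bit SSL view, as the blurb notes.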

Testing and Reliable Design of CMOS Circuits (Hardcover, 1990 ed.)
Niraj K. Jha, Sandip Kundu
R4,531 Discovery Miles 45 310 Ships in 10 - 15 working days

In the last few years CMOS technology has become increasingly dominant for realizing Very Large Scale Integrated (VLSI) circuits. The popularity of this technology is due to its high density and low power requirement. The ability to realize very complex circuits on a single chip has brought about a revolution in the world of electronics and computers. However, the rapid advancements in this area pose many new problems in the area of testing. Testing has become a very time-consuming process. In order to ease the burden of testing, many schemes for designing the circuit for improved testability have been presented. These design for testability techniques have begun to catch the attention of chip manufacturers. The trend is towards placing increased emphasis on these techniques. Another byproduct of the increase in the complexity of chips is their higher susceptibility to faults. In order to take care of this problem, we need to build fault-tolerant systems. The area of fault-tolerant computing has steadily gained in importance. Today many universities offer courses in the areas of digital system testing and fault-tolerant computing. Due to the importance of CMOS technology, a significant portion of these courses may be devoted to CMOS testing. This book has been written as a reference text for such courses offered at the senior or graduate level. Familiarity with logic design and switching theory is assumed. The book should also prove to be useful to professionals working in the semiconductor industry.

Philosophy of Technology - Practical, Historical and Other Dimensions (Hardcover, 1989 ed.)
P. T. Durbin
R4,506 Discovery Miles 45 060 Ships in 10 - 15 working days

The corps of philosophers who make up the Society for Philosophy & Technology has now been collaborating, in one fashion or another, for almost fifteen years. In addition, the number of philosophers, world-wide, who have begun to focus their analytical skills on technology and related social problems grows every year. (It would certainly swell the ranks if all of them joined the Society!) It seems more than appropriate, in this context, to publish a miscellaneous volume that emphasizes the extraordinary range and diversity of contemporary contributions to the philosophical understanding of the exceedingly complex phenomenon that is modern technology. My thanks, once again, to the anonymous referees who do so much to maintain standards for the series. And thanks also to the secretaries, Mary Imperatore and Dorothy Milsom, in the Philosophy Department at the University of Delaware; their typing and retyping of the MSS, and especially notes and references, also contributes to keeping our standards high. PAUL T. DURBIN

The Annealing Algorithm (Hardcover, 1989 ed.)
R.H.J.M. Otten, L.P.P.P. van Ginneken
R4,502 Discovery Miles 45 020 Ships in 10 - 15 working days

The goal of the research out of which this monograph grew was to make annealing as much as possible a general-purpose optimization routine. At first glance this may seem a straightforward task, for the formulation of its concept suggests applicability to any combinatorial optimization problem. All that is needed to run annealing on such a problem is a unique representation for each configuration, a procedure for measuring its quality, and a neighbor relation. Much more is needed, however, for obtaining acceptable results consistently in a reasonably short time. It is even doubtful whether the problem can be formulated such that annealing becomes an adequate approach for all instances of an optimization problem. Questions such as what is the best formulation for a given instance, and how the process should be controlled, have to be answered. Although much progress has been made in the years after the introduction of the concept into the field of combinatorial optimization in 1981, some important questions still do not have a definitive answer. In this book the reader will find the foundations of annealing in a self-contained and consistent presentation. Although the physical analogue from which the concept emanated is mentioned in the first chapter, all theory is developed within the framework of Markov chains. To achieve a high degree of instance independence, adaptive strategies are introduced.
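The three ingredients the blurb names (a configuration representation, a quality measure, and a neighbor relation) are enough to write a generic annealing loop. The sketch below is a minimal illustration only; the toy one-dimensional instance, geometric cooling schedule, and parameter values are arbitrary choices for the example, not the book's recommendations:

```python
import math
import random

def anneal(cost, neighbor, state, t0=10.0, cooling=0.95, steps=2000):
    """Generic simulated annealing over (state, cost, neighbor)."""
    random.seed(0)                 # fixed seed for a reproducible illustration
    t = t0
    best = state
    for _ in range(steps):
        cand = neighbor(state)
        delta = cost(cand) - cost(state)
        # Accept improvements always; accept uphill moves with
        # Boltzmann probability exp(-delta / t).
        if delta <= 0 or random.random() < math.exp(-delta / t):
            state = cand
        if cost(state) < cost(best):
            best = state
        t *= cooling               # geometric cooling schedule
    return best

# Toy instance: minimize (x - 7)^2 over the integers,
# with +-1 steps as the neighbor relation.
cost = lambda x: (x - 7) ** 2
neighbor = lambda x: x + random.choice([-1, 1])
print(anneal(cost, neighbor, 100))   # prints 7
```

As the blurb warns, getting consistently good results on hard instances takes far more than this skeleton: problem-specific formulations and adaptive control of the temperature schedule.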

Technology and Contemporary Life (Paperback, Softcover reprint of the original 1st ed. 1988)
P. T. Durbin
R1,532 Discovery Miles 15 320 Ships in 10 - 15 working days

Nearly everyone agrees that life has changed in our technological society, whether the contrast is with earlier stages in Western culture or with non-Western cultures. "Modernization" is just one of various terms that have been applied to the process by which we have arrived at the peculiar lifestyle typical of our age; whatever the term for the process, almost all analysts agree in finding technology to be one of its key ingredients. This is the judgment of critics of all sorts - anthropologists, historians, literary figures, sociologists, theologians. Volume 4 in the Philosophy and Technology series brings the perspectives of philosophers to bear on the issue of characterizing contemporary life, mainly in high-technology societies. Some of the philosophers look at the issue directly. Others focus on work life - or on the living arrangements that surround or condition or offer refuge from work life in technological society. Still others reflect on particular technologies, especially biotechnology and computer technology, that are increasingly affecting both work and family life. There is also a paper on the nature of thinking in technological praxis, along with two papers on whether it is appropriate to export this sort of thinking to Third World countries, and another paper on the issue of responsibility in technology - which would have fit better in volume 3 of the series, entitled Technology and Responsibility (1987). Finally, volume 4 closes with a broad-ranging bibliography that takes work and technology as its focus.

Relaxation Techniques for the Simulation of VLSI Circuits (Hardcover, 1987 ed.): Jacob K. White, Alberto L.... Relaxation Techniques for the Simulation of VLSI Circuits (Hardcover, 1987 ed.)
Jacob K. White, Alberto L. Sangiovanni-Vincentelli
R3,016 Discovery Miles 30 160 Ships in 10 - 15 working days

Circuit simulation has been a topic of great interest to the integrated circuit design community for many years. It is a difficult, and interesting, problem because circuit simulators are very heavily used, consuming thousands of computer hours every year, and therefore the algorithms must be very efficient. In addition, circuit simulators are heavily relied upon, with millions of dollars being gambled on their accuracy, and therefore the algorithms must be very robust. At the University of California, Berkeley, a great deal of research has been devoted to the study of both the numerical properties and the efficient implementation of circuit simulation algorithms. Research efforts have led to several programs, starting with CANCER in the 1960's and the enormously successful SPICE program in the early 1970's, to MOTIS-C, SPLICE, and RELAX in the late 1970's, and finally to SPLICE2 and RELAX2 in the 1980's. Our primary goal in writing this book was to present some of the results of our current research on the application of relaxation algorithms to circuit simulation. As we began, we realized that a large body of mathematical and experimental results had been amassed over the past twenty years by graduate students, professors, and industry researchers working on circuit simulation. It became a secondary goal to try to find an organization of this mass of material that was mathematically rigorous, had practical relevance, and still retained the natural intuitive simplicity of the circuit simulation subject.

Multi-Level Simulation for VLSI Design (Hardcover, 1987 ed.): D. D. Hill, D. R. Coelho Multi-Level Simulation for VLSI Design (Hardcover, 1987 ed.)
D. D. Hill, D. R. Coelho
R3,020 Discovery Miles 30 200 Ships in 10 - 15 working days

Computer Aided Design (CAD) is today a widely used expression referring to the study of ways in which computers can be used to expedite the design process. This can include the design of physical systems, architectural environments, manufacturing processes, and many other areas. This book concentrates on one area of CAD: the design of computer systems. Within this area, it focuses on just two aspects of computer design, the specification and the simulation of digital systems. VLSI design requires support in many other CAD areas, including automatic layout, IC fabrication analysis, test generation, and others. The problem of specification is unique, however, in that it is often the first one encountered in large chip designs, and one that is unlikely ever to be completely automated. This is true because until a design's objectives are specified in a machine-readable form, there is no way for other CAD tools to verify that the target system meets them. And unless the specifications can be simulated, it is unlikely that designers will have confidence in them, since specifications are potentially erroneous themselves. (In this context the term target system refers to the hardware and/or software that will ultimately be fabricated.) On the other hand, since the functionality of a VLSI chip is ultimately determined by its layout geometry, one might question the need for CAD tools that work with areas other than layout.

The Bounding Approach to VLSI Circuit Simulation (Hardcover, 1986 ed.): C.A. Zukowski The Bounding Approach to VLSI Circuit Simulation (Hardcover, 1986 ed.)
C.A. Zukowski
R4,519 Discovery Miles 45 190 Ships in 10 - 15 working days

This book proposes a new approach to circuit simulation that is still in its infancy. The reason for publishing this work as a monograph at this time is to quickly distribute these ideas to the research community for further study. The book is based on a doctoral dissertation undertaken at MIT between 1982 and 1985. In 1982 the author joined a research group that was applying bounding techniques to simple VLSI timing analysis models. The conviction that bounding analysis could also be successfully applied to sophisticated digital MOS circuit models led to the research presented here. Acknowledgments: The author would like to acknowledge many helpful discussions and much support from his research group at MIT, including Lance Glasser, John Wyatt, Jr., and Paul Penfield, Jr. Many others have also contributed to this work in some way, including Albert Ruehli, Mark Horowitz, Rich Zippel, Chris Terman, Jacob White, Mark Matson, Bob Armstrong, Steve McCormick, Cyrus Bamji, John Wroclawski, Omar Wing, Gary Dare, Paul Bassett, and Rick LaMaire. The author would like to give special thanks to his wife, Deborra, for her support and many contributions to the presentation of this research. The author would also like to thank his parents for their encouragement, and IBM for its financial support of this project through a graduate fellowship. The VLSI revolution of the 1970's has created a need for new circuit analysis techniques.

Logic Minimization Algorithms for VLSI Synthesis (Hardcover, 1984 ed.): Robert K. Brayton, Gary D. Hachtel, C. McMullen,... Logic Minimization Algorithms for VLSI Synthesis (Hardcover, 1984 ed.)
Robert K. Brayton, Gary D. Hachtel, C. McMullen, Alberto L. Sangiovanni-Vincentelli
R5,245 Discovery Miles 52 450 Ships in 10 - 15 working days

The roots of the project which culminates with the writing of this book can be traced to the work on logic synthesis started in 1979 at the IBM Watson Research Center and at University of California, Berkeley. During the preliminary phases of these projects, the importance of logic minimization for the synthesis of area and performance effective circuits clearly emerged. In 1980, Richard Newton stirred our interest by pointing out new heuristic algorithms for two-level logic minimization and the potential for improving upon existing approaches. In the summer of 1981, the authors organized and participated in a seminar on logic manipulation at IBM Research. One of the goals of the seminar was to study the literature on logic minimization and to look at heuristic algorithms from a fundamental and comparative point of view. The fruits of this investigation were surprisingly abundant: it was apparent from an initial implementation of recursive logic minimization (ESPRESSO-I) that, if we merged our new results into a two-level minimization program, an important step forward in automatic logic synthesis could result. ESPRESSO-II was born and an APL implementation was created in the summer of 1982. The results of preliminary tests on a fairly large set of industrial examples were good enough to justify the publication of our algorithms. It is hoped that the strength and speed of our minimizer warrant its Italian name, which denotes both express delivery and a specially-brewed black coffee.

Technology and Reality (Hardcover, 1982 ed.): J. K. Feibleman Technology and Reality (Hardcover, 1982 ed.)
J. K. Feibleman
R3,020 Discovery Miles 30 200 Ships in 10 - 15 working days

In the following pages I have endeavored to show the impact on philosophy of technology and science; more specifically, I have tried to make up for the neglect by the classical philosophers of the historic role of technology and also to suggest what positive effects on philosophy the almost daily advances in the physical sciences might have. Above all, I wanted to remind the ontologist of his debt to the artificer: technology with its recent gigantic achievements has introduced a new ingredient into the world, and so is sure to influence our knowledge of what there is. This book, then, could as well have been called 'Ethnotechnology: An Explanation of Human Behavior by Means of Material Culture', but the picture is a complex one, and there are many more special problems that need to be prominently featured in the discussion. Human culture never goes forward on all fronts at the same time. In our era it is unquestionably not only technology but also the sciences which are making the most rapid progress. Philosophy has not been very successful at keeping up with them. As a consequence there is an 'enormous gulf between scientists and philosophers today, a gulf which is as large as it has ever been' (1). I can see that with science moving so rapidly, its current lessons for philosophy might well be outmoded tomorrow.

You may like...
Characterization of Biomaterials
Amit Bandyopadhyay, Susmita Bose Paperback R3,224 R2,932 Discovery Miles 29 320
Fortnite Annual 2020
Hardcover  (1)
R199 R186 Discovery Miles 1 860
Navigating Information Literacy
Theo Bothma, Erica Cosijn, … Paperback R633 R557 Discovery Miles 5 570
Best Available and Safest Technologies…
National Research Council, National Academy of Engineering, … Paperback R804 Discovery Miles 8 040
The Story of Microsoft
Nell Musolf Paperback R243 Discovery Miles 2 430
Wasteland - The Dirty Truth About What…
Oliver Franklin-Wallis Hardcover R818 Discovery Miles 8 180
The Geometry of Universal Mind - Volume…
Bob Mustin Paperback R487 R92 Discovery Miles 920
African Women, ICT and Neoliberal…
Assata Zerai Paperback R645 Discovery Miles 6 450
Recalculating - Navigate Your Career…
Lindsey Pollak Paperback R290 Discovery Miles 2 900
Assembling Bus Rapid Transit in the…
Malve Jacobsen Paperback R1,211 Discovery Miles 12 110

 
