Showing 1 - 6 of 6 matches in All Departments

Formal Equivalence Checking and Design Debugging (Hardcover, 1998 ed.)
Shi-Yu Huang, Kwang-Ting (Tim) Cheng
R5,342 Discovery Miles 53 420 Ships in 10 - 15 working days

Formal Equivalence Checking and Design Debugging covers two major topics in design verification: logic equivalence checking and design debugging. The first part of the book reviews the design problems that require logic equivalence checking and describes the underlying technologies that are used to solve them. Some novel approaches to the problems of verifying design revisions after intensive sequential transformations such as retiming are described in detail. The second part of the book gives a thorough survey of previous and recent literature on design error diagnosis and design error correction. This part also provides an in-depth analysis of the algorithms used in two logic debugging software programs, ErrorTracer and AutoFix, developed by the authors.

From the Foreword: "With the adoption of the static sign-off approach to verifying circuit implementations the application-specific integrated circuit (ASIC) industry will experience the first radical methodological revolution since the adoption of logic synthesis. Equivalence checking is one of the two critical elements of this methodological revolution. This book is timely for either the designer seeking to better understand the mechanics of equivalence checking or for the CAD researcher who wishes to investigate well-motivated research problems such as equivalence checking of retimed designs or error diagnosis in sequential circuits." Kurt Keutzer, University of California, Berkeley
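To make the core idea of equivalence checking concrete, here is a minimal, purely illustrative sketch in Python (not taken from the book, and far simpler than the BDD- and SAT-based techniques a real checker would use): two hypothetical combinational implementations, golden and revised, are compared exhaustively, and any mismatching input pattern serves as a counterexample for error diagnosis. Both functions and the check itself are made-up examples.

from itertools import product

def golden(a, b, c):
    # Reference implementation: a ? b : c
    return (a and b) or ((not a) and c)

def revised(a, b, c):
    # Revised netlist after a toy optimization; should still compute a ? b : c.
    return (a and b) or ((not a) and c) or (a and b and c)

def check_equivalence(f, g, n_inputs):
    # Exhaustively compare two combinational functions. Real equivalence
    # checkers avoid this exponential enumeration with BDD or SAT reasoning
    # on a miter circuit, but the question being decided is the same.
    for bits in product([False, True], repeat=n_inputs):
        if f(*bits) != g(*bits):
            return False, bits  # mismatching pattern: starting point for debugging
    return True, None

ok, counterexample = check_equivalence(golden, revised, 3)
print("equivalent" if ok else f"mismatch at input {counterexample}")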

Stochastic Versus Fuzzy Approaches to Multiobjective Mathematical Programming under Uncertainty (Hardcover, 1990 ed.)
Shi-Yu Huang, Jacques Teghem
R5,964 Discovery Miles 59 640 Ships in 10 - 15 working days

Operations Research is a field whose major contribution has been to propose a rigorous formulation of often ill-defined problems pertaining to the organization or the design of large scale systems, such as resource allocation problems, scheduling and the like. While this effort did help a lot in understanding the nature of these problems, the mathematical models have proved only partially satisfactory due to the difficulty in gathering precise data, and in formulating objective functions that reflect the multi-faceted notion of optimal solution according to human experts. In this respect linear programming is a typical example of an impressive achievement of Operations Research that in its deterministic form is not always adapted to real world decision-making: everything must be expressed in terms of linear constraints; yet the coefficients that appear in these constraints may not be so well-defined, either because their value depends upon other parameters (not accounted for in the model) or because they cannot be precisely assessed, and only qualitative estimates of these coefficients are available. Similarly the best solution to a linear programming problem may be more a matter of compromise between various criteria than just minimizing or maximizing a linear objective function. Lastly the constraints, expressed by equalities or inequalities between linear expressions, are often softer in reality than what their mathematical expression might let us believe, and infeasibility as detected by linear programming techniques can often be coped with by making trade-offs with the real world.
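As a toy illustration of the two difficulties mentioned in this description, the sketch below (made-up data, using scipy; it is not the stochastic or fuzzy formalism developed in the book) combines two objectives with a simple weighted-sum compromise and handles an imprecisely known constraint coefficient by re-solving the model over a few plausible scenarios.

# Toy illustration, not the book's formalism: a small LP with two objectives
# handled by a weighted-sum compromise, and one constraint coefficient known
# only as a qualitative estimate, handled by scenario analysis.
import numpy as np
from scipy.optimize import linprog

c1 = np.array([3.0, 5.0])   # objective 1 (e.g. profit), to maximize
c2 = np.array([4.0, 2.0])   # objective 2 (e.g. service level), to maximize

# Resource constraint x1 + a2*x2 <= 10; a2 is only known to lie around 1.5-2.5.
for a2 in (1.5, 2.0, 2.5):
    weights = (0.5, 0.5)                       # compromise between the criteria
    c = -(weights[0] * c1 + weights[1] * c2)   # linprog minimizes, so negate
    res = linprog(c, A_ub=[[1.0, a2]], b_ub=[10.0],
                  bounds=[(0, 4), (0, 4)], method="highs")
    print(f"a2={a2}: x={res.x.round(2)}, compromise value={-res.fun:.2f}")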

Intelligent Decision Support - Handbook of Applications and Advances of the Rough Sets Theory (Hardcover, 1992 ed.)
Shi-Yu Huang
R8,613 Discovery Miles 86 130 Ships in 12 - 17 working days

Intelligent decision support is based on human knowledge related to a specific part of a real or abstract world. When the knowledge is gained by experience, it is induced from empirical data. The data structure, called an information system, is a record of objects described by a set of attributes. Knowledge is understood here as an ability to classify objects. Objects in the same class are indiscernible by means of the attributes and form elementary building blocks (granules, atoms). In particular, the granularity of knowledge means that some notions cannot be expressed precisely within the available knowledge and can be defined only vaguely. In the rough sets theory created by Z. Pawlak, each imprecise concept is replaced by a pair of precise concepts called its lower and upper approximation. These approximations are fundamental tools for reasoning about knowledge. The rough sets philosophy has turned out to be a very effective new tool with many successful real-life applications to its credit. It is worth stressing that no auxiliary assumptions about the data, such as probability or membership function values, are needed, which is its great advantage. The present book reveals a wide spectrum of applications of the rough set concept, giving the reader the flavor of, and insight into, the methodology of the newly developed disciplines. Although the book emphasizes applications, comparison with other related methods and further developments receive due attention.
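The lower/upper approximation idea described above can be shown in a few lines of Python on a made-up information system (the objects, attribute values, and concept below are invented for illustration, not taken from the book): objects with identical attribute values form granules, and a concept is bracketed by the granules lying entirely inside it and the granules that merely touch it.

# Minimal sketch of Pawlak's lower/upper approximations on made-up data.
from collections import defaultdict

# Information system: object -> attribute values (toy records).
table = {
    "o1": ("high", "yes"),
    "o2": ("high", "yes"),
    "o3": ("low",  "yes"),
    "o4": ("low",  "no"),
    "o5": ("low",  "no"),
}

concept = {"o1", "o3", "o4"}   # the (imprecise) concept to approximate

# Granules: objects with identical attribute values are indiscernible.
granules = defaultdict(set)
for obj, attrs in table.items():
    granules[attrs].add(obj)

# Lower approximation: objects certainly in the concept (their whole granule is inside).
lower = {o for g in granules.values() if g <= concept for o in g}
# Upper approximation: objects possibly in the concept (their granule overlaps it).
upper = {o for g in granules.values() if g & concept for o in g}

print("lower:", sorted(lower))   # ['o3']
print("upper:", sorted(upper))   # ['o1', 'o2', 'o3', 'o4', 'o5']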

Formal Equivalence Checking and Design Debugging (Paperback, Softcover reprint of the original 1st ed. 1998)
Shi-Yu Huang, Kwang-Ting (Tim) Cheng
R5,185 Discovery Miles 51 850 Ships in 10 - 15 working days

Formal Equivalence Checking and Design Debugging covers two major topics in design verification: logic equivalence checking and design debugging. The first part of the book reviews the design problems that require logic equivalence checking and describes the underlying technologies that are used to solve them. Some novel approaches to the problems of verifying design revisions after intensive sequential transformations such as retiming are described in detail. The second part of the book gives a thorough survey of previous and recent literature on design error diagnosis and design error correction. This part also provides an in-depth analysis of the algorithms used in two logic debugging software programs, ErrorTracer and AutoFix, developed by the authors.

From the Foreword: "With the adoption of the static sign-off approach to verifying circuit implementations the application-specific integrated circuit (ASIC) industry will experience the first radical methodological revolution since the adoption of logic synthesis. Equivalence checking is one of the two critical elements of this methodological revolution. This book is timely for either the designer seeking to better understand the mechanics of equivalence checking or for the CAD researcher who wishes to investigate well-motivated research problems such as equivalence checking of retimed designs or error diagnosis in sequential circuits." Kurt Keutzer, University of California, Berkeley

Stochastic Versus Fuzzy Approaches to Multiobjective Mathematical Programming under Uncertainty (Paperback, Softcover reprint of the original 1st ed. 1990)
Shi-Yu Huang, Jacques Teghem
R5,746 Discovery Miles 57 460 Ships in 10 - 15 working days

Operations Research is a field whose major contribution has been to propose a rigorous formulation of often ill-defined problems pertaining to the organization or the design of large scale systems, such as resource allocation problems, scheduling and the like. While this effort did help a lot in understanding the nature of these problems, the mathematical models have proved only partially satisfactory due to the difficulty in gathering precise data, and in formulating objective functions that reflect the multi-faceted notion of optimal solution according to human experts. In this respect linear programming is a typical example of an impressive achievement of Operations Research that in its deterministic form is not always adapted to real world decision-making: everything must be expressed in terms of linear constraints; yet the coefficients that appear in these constraints may not be so well-defined, either because their value depends upon other parameters (not accounted for in the model) or because they cannot be precisely assessed, and only qualitative estimates of these coefficients are available. Similarly the best solution to a linear programming problem may be more a matter of compromise between various criteria than just minimizing or maximizing a linear objective function. Lastly the constraints, expressed by equalities or inequalities between linear expressions, are often softer in reality than what their mathematical expression might let us believe, and infeasibility as detected by linear programming techniques can often be coped with by making trade-offs with the real world.

Intelligent Decision Support - Handbook of Applications and Advances of the Rough Sets Theory (Paperback, Softcover reprint of hardcover 1st ed. 1992)
Shi-Yu Huang
R8,537 Discovery Miles 85 370 Ships in 10 - 15 working days

Intelligent decision support is based on human knowledge related to a specific part of a real or abstract world. When the knowledge is gained by experience, it is induced from empirical data. The data structure, called an information system, is a record of objects described by a set of attributes. Knowledge is understood here as an ability to classify objects. Objects in the same class are indiscernible by means of the attributes and form elementary building blocks (granules, atoms). In particular, the granularity of knowledge means that some notions cannot be expressed precisely within the available knowledge and can be defined only vaguely. In the rough sets theory created by Z. Pawlak, each imprecise concept is replaced by a pair of precise concepts called its lower and upper approximation. These approximations are fundamental tools for reasoning about knowledge. The rough sets philosophy has turned out to be a very effective new tool with many successful real-life applications to its credit. It is worth stressing that no auxiliary assumptions about the data, such as probability or membership function values, are needed, which is its great advantage. The present book reveals a wide spectrum of applications of the rough set concept, giving the reader the flavor of, and insight into, the methodology of the newly developed disciplines. Although the book emphasizes applications, comparison with other related methods and further developments receive due attention.
