Showing 1 - 9 of 9 matches in All Departments

From Rural China to the Ivy League - Reminiscences of Transformations in Modern Chinese History (Hardcover)
Ying Shi Yu; Contributions by Michael S. Duke, Josephine Chiu-Duke
R2,513 Discovery Miles 25 130 Ships in 18 - 22 working days
Formal Equivalence Checking and Design Debugging (Hardcover, 1998 ed.)
Shi-Yu Huang, Kwang-Ting (Tim) Cheng
R4,820 Discovery Miles 48 200 Ships in 18 - 22 working days

Formal Equivalence Checking and Design Debugging covers two major topics in design verification: logic equivalence checking and design debugging. The first part of the book reviews the design problems that require logic equivalence checking and describes the underlying technologies that are used to solve them. Some novel approaches to the problems of verifying design revisions after intensive sequential transformations such as retiming are described in detail. The second part of the book gives a thorough survey of previous and recent literature on design error diagnosis and design error correction. This part also provides an in-depth analysis of the algorithms used in two logic debugging software programs, ErrorTracer and AutoFix, developed by the authors.

From the Foreword: 'With the adoption of the static sign-off approach to verifying circuit implementations, the application-specific integrated circuit (ASIC) industry will experience the first radical methodological revolution since the adoption of logic synthesis. Equivalence checking is one of the two critical elements of this methodological revolution. This book is timely for either the designer seeking to better understand the mechanics of equivalence checking or the CAD researcher who wishes to investigate well-motivated research problems such as equivalence checking of retimed designs or error diagnosis in sequential circuits.' - Kurt Keutzer, University of California, Berkeley
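The blurb above describes logic equivalence checking; as a minimal illustrative sketch (not taken from the book, and far simpler than the BDD- and SAT-based machinery real checkers use), two combinational implementations can be compared with a miter-style exhaustive check. The functions `spec` and `revised` are invented 3-input examples:

```python
from itertools import product

# Reference specification and a rewritten "revision" of the same logic.
# Both are invented examples; real equivalence checkers operate on
# netlists with BDD- or SAT-based reasoning, not exhaustive search.
def spec(a, b, c):
    return (a and b) or c

def revised(a, b, c):
    # De Morgan-rewritten form of spec; should be logically identical
    return not ((not (a and b)) and (not c))

def equivalent(f, g, n_inputs):
    """Miter-style check: f and g must agree on every input vector."""
    return all(bool(f(*v)) == bool(g(*v))
               for v in product([False, True], repeat=n_inputs))

print(equivalent(spec, revised, 3))  # True for this pair
```

Exhaustive enumeration is exponential in the input count, which is precisely why the book's subject matter (symbolic equivalence checking) exists.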

Stochastic Versus Fuzzy Approaches to Multiobjective Mathematical Programming under Uncertainty (Hardcover, 1990 ed.)
Shi-Yu Huang, Jaques Teghem
R5,378 Discovery Miles 53 780 Ships in 18 - 22 working days

Operations Research is a field whose major contribution has been to propose a rigorous formulation of often ill-defined problems pertaining to the organization or the design of large-scale systems, such as resource allocation problems, scheduling and the like. While this effort did help a lot in understanding the nature of these problems, the mathematical models have proved only partially satisfactory due to the difficulty in gathering precise data, and in formulating objective functions that reflect the multi-faceted notion of optimal solution according to human experts. In this respect linear programming is a typical example of an impressive achievement of Operations Research that in its deterministic form is not always adapted to real-world decision-making: everything must be expressed in terms of linear constraints; yet the coefficients that appear in these constraints may not be so well-defined, either because their value depends upon other parameters (not accounted for in the model) or because they cannot be precisely assessed, and only qualitative estimates of these coefficients are available. Similarly, the best solution to a linear programming problem may be more a matter of compromise between various criteria than just minimizing or maximizing a linear objective function. Lastly, the constraints, expressed by equalities or inequalities between linear expressions, are often softer in reality than what their mathematical expression might let us believe, and infeasibility as detected by linear programming techniques can often be coped with by making trade-offs with the real world.
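As a toy numeric illustration of the "soft constraint" idea described above (invented for this sketch; the book's stochastic and fuzzy formulations are far more general), a linear constraint can be softened by penalizing its violation instead of forbidding it:

```python
# Minimize cost = 2x + 3y subject to x + y >= 10, but treat the
# constraint as soft: a shortfall incurs a penalty per unit rather
# than making the solution infeasible outright.
def soft_cost(x, y, penalty=5.0):
    shortfall = max(0.0, 10.0 - (x + y))
    return 2 * x + 3 * y + penalty * shortfall

# Brute-force search over a small integer grid (illustration only;
# a real solver would handle this as a continuous program).
best = min(((soft_cost(x, y), x, y)
            for x in range(21) for y in range(21)),
           key=lambda t: t[0])
# With this penalty it is cheaper to satisfy the constraint (x=10, y=0,
# cost 20) than to ignore it (cost 50 at the origin); a small enough
# penalty would flip that trade-off.
```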

Intelligent Decision Support - Handbook of Applications and Advances of the Rough Sets Theory (Hardcover, 1992 ed.)
Shi-Yu Huang
R7,899 Discovery Miles 78 990 Ships in 18 - 22 working days

Intelligent decision support is based on human knowledge related to a specific part of a real or abstract world. When the knowledge is gained by experience, it is induced from empirical data. The data structure, called an information system, is a record of objects described by a set of attributes. Knowledge is understood here as an ability to classify objects. Objects in the same class are indiscernible by means of the attributes and form elementary building blocks (granules, atoms). In particular, the granularity of knowledge means that some notions cannot be expressed precisely within the available knowledge and can be defined only vaguely. In the rough sets theory created by Z. Pawlak, each imprecise concept is replaced by a pair of precise concepts called its lower and upper approximation. These approximations are fundamental tools for reasoning about knowledge. The rough sets philosophy has turned out to be a very effective new tool with many successful real-life applications to its credit. It is worth stressing that no auxiliary assumptions about the data, such as probability or membership function values, are needed, which is its great advantage. The present book reveals a wide spectrum of applications of the rough set concept, giving the reader the flavor of, and insight into, the methodology of the newly developed disciplines. Although the book emphasizes applications, comparison with other related methods and further developments receive due attention.
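The lower/upper approximation idea sketched above can be shown on a toy information system (the objects and attribute values below are invented for illustration, not drawn from the handbook):

```python
# One-attribute information system: five objects described by colour.
objects = {1: "red", 2: "red", 3: "blue", 4: "blue", 5: "green"}
target = {1, 2, 3}  # the (vague) concept to approximate

# Granules: indiscernibility classes of objects sharing attribute values.
granules = {}
for obj, colour in objects.items():
    granules.setdefault(colour, set()).add(obj)

# Lower approximation: union of granules wholly contained in the target.
lower = {o for g in granules.values() if g <= target for o in g}
# Upper approximation: union of granules that intersect the target.
upper = {o for g in granules.values() if g & target for o in g}
# The boundary region (upper - lower) holds the objects that cannot be
# classified precisely with the available attribute: objects 3 and 4
# are both "blue" and hence indiscernible, yet only 3 is in the target.
```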

Kernel-based Data Fusion for Machine Learning - Methods and Applications in Bioinformatics and Text Mining (Hardcover, Edition.)
Shi Yu, Leon-Charles Tranchevent, Bart Moor, Yves Moreau
R4,686 Discovery Miles 46 860 Ships in 18 - 22 working days

Data fusion problems arise frequently in many different fields. This book provides a specific introduction to data fusion problems using support vector machines. In the first part, the book begins with a brief survey of additive models and Rayleigh quotient objectives in machine learning, and then introduces kernel fusion as the additive expansion of support vector machines in the dual problem. The second part presents several novel kernel fusion algorithms and some real applications in supervised and unsupervised learning. The last part of the book substantiates the value of the proposed theories and algorithms in MerKator, an open software package that identifies disease-relevant genes by integrating heterogeneous genomic data sources across multiple species. The topics presented in this book are meant for researchers and students who use support vector machines. Several topics addressed in the book may also interest computational biologists who want to tackle data fusion challenges in real applications. The background required of the reader is a good knowledge of data mining, machine learning and linear algebra.
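A minimal sketch of the additive-combination idea behind kernel fusion (toy data invented here; the book's algorithms learn the combination weights and plug the fused kernel into SVM dual problems, which this omits):

```python
import math

# Two "data sources" yield two Gram matrices over the same samples;
# an entrywise convex combination mu*K1 + (1-mu)*K2 is again a valid
# kernel, which is what makes additive kernel fusion possible.
def linear_kernel(xs):
    return [[sum(a * b for a, b in zip(u, v)) for v in xs] for u in xs]

def rbf_kernel(xs, gamma=0.5):
    return [[math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(u, v)))
             for v in xs] for u in xs]

def fuse(K1, K2, mu=0.5):
    """Entrywise convex combination of two Gram matrices."""
    n = len(K1)
    return [[mu * K1[i][j] + (1 - mu) * K2[i][j] for j in range(n)]
            for i in range(n)]

X = [(0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
K = fuse(linear_kernel(X), rbf_kernel(X))  # fused 3x3 Gram matrix
```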

Kernel-based Data Fusion for Machine Learning - Methods and Applications in Bioinformatics and Text Mining (Paperback, 2011 ed.)
Shi Yu, Leon-Charles Tranchevent, Bart Moor, Yves Moreau
R4,673 Discovery Miles 46 730 Ships in 18 - 22 working days


Formal Equivalence Checking and Design Debugging (Paperback, Softcover reprint of the original 1st ed. 1998)
Shi-Yu Huang, Kwang-Ting (Tim) Cheng
R4,679 Discovery Miles 46 790 Ships in 18 - 22 working days


Intelligent Decision Support - Handbook of Applications and Advances of the Rough Sets Theory (Paperback, Softcover reprint of hardcover 1st ed. 1992)
Shi-Yu Huang
R7,688 Discovery Miles 76 880 Ships in 18 - 22 working days


Stochastic Versus Fuzzy Approaches to Multiobjective Mathematical Programming under Uncertainty (Paperback, Softcover reprint of the original 1st ed. 1990)
Shi-Yu Huang, Jaques Teghem
R5,182 Discovery Miles 51 820 Ships in 18 - 22 working days

