Rule-based fuzzy modeling has been recognised as a powerful technique for the modeling of partly-known nonlinear systems. Fuzzy models can effectively integrate information from different sources, such as physical laws, empirical models, measurements and heuristics. Application areas of fuzzy models include prediction, decision support, system analysis, control design, etc. Fuzzy Modeling for Control addresses fuzzy modeling from the systems and control engineering points of view. It focuses on the selection of appropriate model structures, on the acquisition of dynamic fuzzy models from process measurements (fuzzy identification), and on the design of nonlinear controllers based on fuzzy models. To automatically generate fuzzy models from measurements, a comprehensive methodology is developed which employs fuzzy clustering techniques to partition the available data into subsets characterized by locally linear behaviour. The relationships between the presented identification method and linear regression are exploited, allowing for the combination of fuzzy logic techniques with standard system identification tools. Attention is paid to the trade-off between the accuracy and transparency of the obtained fuzzy models. Control design based on a fuzzy model of a nonlinear dynamic process is addressed, using the concepts of model-based predictive control and internal model control with an inverted fuzzy model. To this end, methods to exactly invert specific types of fuzzy models are presented. In the context of predictive control, branch-and-bound optimization is applied. The main features of the presented techniques are illustrated by means of simple examples. In addition, three real-world applications are described. Finally, software tools for building fuzzy models from measurements are available from the author.
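The clustering-based identification described above yields rules with locally linear (Takagi-Sugeno) consequents. As a hedged illustration only, not code from the book or its software tools, a zero-order Takagi-Sugeno model can be sketched in a few lines: triangular membership functions cover the input range, and the output is the membership-weighted average of the rule consequents.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_model(x, rules):
    """Zero-order Takagi-Sugeno inference: the output is the
    membership-weighted average of the rules' constant consequents."""
    weights = [mf(x) for mf, _ in rules]
    total = sum(weights)
    if total == 0:
        return 0.0
    return sum(w * y for w, (_, y) in zip(weights, rules)) / total

# Hypothetical example: three rules locally approximating y = x^2 on [0, 2]
rules = [
    (lambda x: tri(x, -1, 0, 1), 0.0),  # "x is small"  -> y is about 0
    (lambda x: tri(x, 0, 1, 2), 1.0),   # "x is medium" -> y is about 1
    (lambda x: tri(x, 1, 2, 3), 4.0),   # "x is large"  -> y is about 4
]
print(fuzzy_model(1.0, rules))  # at the "medium" peak -> 1.0
print(fuzzy_model(1.5, rules))  # halfway between rules -> 2.5
```

Identification, in the sense described above, would fit the membership parameters and consequents to measured data rather than choosing them by hand.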
Fuzzy Logic: State of the Art covers a wide range of both theory and applications of fuzzy sets, ranging from mathematical basics, through artificial intelligence, computer management and systems science to engineering applications. Fuzzy Logic will be of interest to researchers working in fuzzy set theory and its applications.
Also in: The Kluwer International Series on Asian Studies in Computer and Information Science, Volume 2
In recent years, there have been several attempts to define a logic for information retrieval (IR). The aim was to provide a rich and uniform representation of information and its semantics with the goal of improving retrieval effectiveness. The basis of a logical model for IR is the assumption that queries and documents can be represented effectively by logical formulae. To retrieve a document, an IR system has to infer the formula representing the query from the formula representing the document. This logical interpretation of query and document emphasizes that relevance in IR is an inference process. The use of logic to build IR models enables one to obtain models that are more general than earlier well-known IR models. Indeed, some logical models are able to represent within a uniform framework various features of IR systems such as hypermedia links, multimedia data, and users' knowledge. Logic also provides a common approach to the integration of IR systems with logical database systems. Finally, logic makes it possible to reason about an IR model and its properties. This latter possibility is becoming increasingly important since conventional evaluation methods, although good indicators of the effectiveness of IR systems, often give results which cannot be predicted, or for that matter satisfactorily explained. However, logic by itself cannot fully model IR. The success or failure of the inference of the query formula from the document formula is not enough to model relevance in IR. It is necessary to take into account the uncertainty inherent in such an inference process. In 1986, Van Rijsbergen proposed the uncertainty logical principle to model relevance as an uncertain inference process. When proposing the principle, Van Rijsbergen was not specific about which logic and which uncertainty theory to use. As a consequence, various logics and uncertainty theories have been proposed and investigated.
The choice of an appropriate logic and uncertainty mechanism has been a main research theme in logical IR modeling leading to a number of logical IR models over the years. Information Retrieval: Uncertainty and Logics contains a collection of exciting papers proposing, developing and implementing logical IR models. This book is appropriate for use as a text for a graduate-level course on Information Retrieval or Database Systems, and as a reference for researchers and practitioners in industry.
Logic and Philosophy of Mathematics in the Early Husserl focuses on the first ten years of Edmund Husserl's work, from the publication of his Philosophy of Arithmetic (1891) to that of his Logical Investigations (1900/01), and aims to precisely locate his early work in the fields of logic, philosophy of logic and philosophy of mathematics. Unlike most phenomenologists, the author refrains from reading Husserl's early work as a more or less immature sketch of claims consolidated only in his later phenomenology, and unlike the majority of historians of logic she emphasizes the systematic strength and the originality of Husserl's logico-mathematical work. The book attempts to reconstruct the discussion between Husserl and those philosophers and mathematicians who contributed to new developments in logic, such as Leibniz, Bolzano, the logical algebraists (especially Boole and Schröder), Frege, and Hilbert and his school. It presents both a comprehensive critical examination of some of the major works produced by Husserl and his antagonists in the last decade of the 19th century and a formal reconstruction of many texts from Husserl's Nachlass that have not yet been the object of systematic scrutiny. This volume will be of particular interest to researchers working in the history, and in the philosophy, of logic and mathematics, and more generally, to analytical philosophers and phenomenologists with a background in standard logic.
Neural Networks and Fuzzy Systems: Theory and Applications discusses theories that have proven useful in applying neural networks and fuzzy systems to real world problems. The book includes performance comparison of neural networks and fuzzy systems using data gathered from real systems. Topics covered include the Hopfield network for combinatorial optimization problems, multilayered neural networks for pattern classification and function approximation, fuzzy systems that have the same functions as multilayered networks, and composite systems that have been successfully applied to real world problems. The author also includes representative neural network models such as the Kohonen network and radial basis function network. New fuzzy systems with learning capabilities are also covered. The advantages and disadvantages of neural networks and fuzzy systems are examined. The performance of these two systems in license plate recognition, a water purification plant, blood cell classification, and other real world problems is compared.
Computer systems that analyze images are critical to a wide variety of applications such as visual inspection systems for various manufacturing processes, remote sensing of the environment from space-borne imaging platforms, and automatic diagnosis from X-rays and other medical imaging sources. Professor Azriel Rosenfeld, the founder of the field of digital image analysis, made fundamental contributions to a wide variety of problems in image processing, pattern recognition and computer vision. Professor Rosenfeld's previous students, postdoctoral scientists, and colleagues illustrate in Foundations of Image Understanding how current research has been influenced by his work as the leading researcher in the area of image analysis for over two decades. Each chapter of Foundations of Image Understanding is written by one of the world's leading experts in his area of specialization, examining digital geometry and topology (early research which laid the foundations for many industrial machine vision systems), edge detection and segmentation (fundamental to systems that analyze complex images of our three-dimensional world), multi-resolution and variable resolution representations for images and maps, parallel algorithms and systems for image analysis, and the importance of human psychophysical studies of vision to the design of computer vision systems. Professor Rosenfeld's chapter briefly discusses topics not covered in the contributed chapters, providing a personal, historical perspective on the development of the field of image understanding. Foundations of Image Understanding is an excellent source of basic material for both graduate students entering the field and established researchers who require a compact source for many of the foundational topics in image analysis.
Call-by-push-value is a programming language paradigm that, surprisingly, breaks down the call-by-value and call-by-name paradigms into simple primitives. This monograph, written for graduate students and researchers, exposes the call-by-push-value structure underlying a remarkable range of semantics, including operational semantics, domains, possible worlds, continuations and games.
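To give a flavor of the decomposition (a loose editorial sketch, not the monograph's formal calculus): call-by-push-value separates values from computations and mediates between them with two primitives, thunk (suspend a computation into a value) and force (resume it). Both calling conventions can then be expressed in terms of where force happens.

```python
# Sketch of the thunk/force primitives using Python callables as
# suspended computations. Names and helpers here are illustrative.

def thunk(comp):
    """Suspend a computation (a zero-argument callable) as a value."""
    return comp  # the callable itself serves as the thunk

def force(th):
    """Resume a suspended computation."""
    return th()

calls = []
def expensive():
    calls.append(1)  # side effect lets us observe when evaluation happens
    return 21

double = lambda th: force(th) + force(th)  # uses its argument twice

# Call-by-value: force the argument once, pass the resulting value.
def cbv_apply(f, arg_thunk):
    v = force(arg_thunk)       # argument computation runs here, exactly once
    return f(lambda: v)

# Call-by-name: pass the thunk itself; it is forced at each use site.
def cbn_apply(f, arg_thunk):
    return f(arg_thunk)

print(cbv_apply(double, thunk(expensive)))  # 42, expensive ran once
print(cbn_apply(double, thunk(expensive)))  # 42, expensive ran twice more
```

The point of the paradigm is that this distinction, implicit above, becomes explicit in the type structure: values and computations live in different syntactic categories.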
One of the important areas of contemporary combinatorics is Ramsey theory. Ramsey theory is basically the study of structure preserved under partitions. The general philosophy is reflected by its interdisciplinary character. The ideas of Ramsey theory are shared by logicians, set theorists and combinatorists, and have been successfully applied in other branches of mathematics. The whole subject is quickly developing and has some new and unexpected applications in areas as remote as functional analysis and theoretical computer science. This book is a homogeneous collection of research and survey articles by leading specialists. It surveys recent activity in this diverse subject and brings the reader up to the boundary of present knowledge. It covers virtually all main approaches to the subject and suggests various problems for individual research.
This monograph studies the logical aspects of domains as used in denotational semantics of programming languages. Frameworks of domain logics are introduced; these serve as foundations for systematic derivations of proof systems from denotational semantics of programming languages. Any proof system so derived is guaranteed to agree with denotational semantics in the sense that the denotation of any program coincides with the set of assertions true of it. The study focuses on two categories for denotational semantics: SFP domains, and the less standard, but important, category of stable domains. The intended readership of this monograph includes researchers and graduate students interested in the relation between semantics of programming languages and formal means of reasoning about programs. A basic knowledge of denotational semantics, mathematical logic, general topology, and category theory is helpful for a full understanding of the material. Part I, SFP Domains. Chapter 1, Introduction. This chapter provides a brief exposition of domain theory, denotational semantics, program logics, and proof systems. It discusses the importance of ideas and results on logic and topology to the understanding of the relation between denotational semantics and program logics. It also describes the motivation for the work presented by this monograph, and how that work fits into a more general program. Finally, it gives a short summary of the results of each chapter. 1.1 Domain Theory. Programming languages are languages with which to perform computation.
This book constitutes the thoroughly refereed post-conference proceedings of the 20th International Workshop on Algebraic Development Techniques, WADT 2010, held in July 2010 in Etelsen, Germany. The 15 revised papers presented were carefully reviewed and selected from 32 presentations. The workshop deals with the following topics: foundations of algebraic specification; other approaches to formal specification including process calculi and models of concurrent, distributed and mobile computing; specification languages, methods, and environments; semantics of conceptual modeling methods and techniques; model-driven development; graph transformations, term rewriting and proof systems; integration of formal specification techniques; formal testing and quality assurance; validation and verification.
This is the first book-length treatment of hybrid logic and its proof-theory. Hybrid logic is an extension of ordinary modal logic which allows explicit reference to individual points in a model (where the points represent times, possible worlds, states in a computer, or something else). This is useful for many applications, for example when reasoning about time one often wants to formulate a series of statements about what happens at specific times. There is little consensus about proof-theory for ordinary modal logic. Many modal-logical proof systems lack important properties and the relationships between proof systems for different modal logics are often unclear. In the present book we demonstrate that hybrid-logical proof-theory remedies these deficiencies by giving a spectrum of well-behaved proof systems (natural deduction, Gentzen, tableau, and axiom systems) for a spectrum of different hybrid logics (propositional, first-order, intensional first-order, and intuitionistic).
"Kind of crude, but it works, boy, it works!" Alan Newell to Herb Simon, Christmas 1955. In 1954 a computer program produced what appears to be the first computer-generated mathematical proof: written by M. Davis at the Institute for Advanced Study, USA, it proved a number-theoretic theorem in Presburger Arithmetic. Christmas 1955 heralded a computer program which generated the first proofs of some propositions of Principia Mathematica, developed by A. Newell, J. Shaw, and H. Simon at RAND Corporation, USA. In Sweden, H. Prawitz, D. Prawitz, and N. Voghera produced the first general program for the full first-order predicate calculus to prove mathematical theorems; their computer proofs were obtained around 1957 and 1958, about the same time that H. Gelernter finished a computer program to prove simple high school geometry theorems. Since the field of computational logic (or automated theorem proving) is emerging from the ivory tower of academic research into real world applications, asserting also a definite place in many university curricula, we feel the time has come to examine and evaluate its history. The article by Martin Davis in the first of this series of volumes traces the most influential ideas back to the 'prehistory' of early logical thought, showing how these ideas influenced the underlying concepts of most early automatic theorem proving programs.
Coalgebraic logic is an important research topic in the areas of concurrency theory, semantics, transition systems and modal logics. It provides a general approach to modeling systems, allowing us to apply important results from coalgebras, universal algebra and category theory in novel ways. Stochastic systems provide important tools for systems modeling, and recent work shows that categorical reasoning may lead to new insights, previously not available in a purely probabilistic setting. This book combines coalgebraic reasoning, stochastic systems and logics. It provides an insight into the principles of coalgebraic logic from a categorical point of view, and applies these systems to interpretations of stochastic coalgebraic logics, which include well-known modal logics and continuous time branching logics. The author introduces stochastic systems together with their probabilistic and categorical foundations and gives a comprehensive discussion of the Giry monad as the underlying categorical construction, presenting many new, hitherto unpublished results. He discusses modal logics, introduces their probabilistic interpretations, and then proceeds to an analysis of Kripke models for coalgebraic logics. The book will be of interest to researchers in theoretical computer science, logic and category theory.
The first edition of this book was published in 1977. The text has been well received and is still used, although it has been out of print for some time. In the intervening three decades, a lot of interesting things have happened to mathematical logic: (i) Model theory has shown that insights acquired in the study of formal languages could be used fruitfully in solving old problems of conventional mathematics. (ii) Mathematics has been and is moving with growing acceleration from the set-theoretic language of structures to the language and intuition of (higher) categories, leaving behind old concerns about infinities: a new view of foundations is now emerging. (iii) Computer science, a no-nonsense child of the abstract computability theory, has been creatively dealing with old challenges and providing new ones, such as the P/NP problem. Planning additional chapters for this second edition, I have decided to focus on model theory, the conspicuous absence of which in the first edition was noted in several reviews, and the theory of computation, including its categorical and quantum aspects. The whole Part IV: Model Theory, is new. I am very grateful to Boris I. Zilber, who kindly agreed to write it. It may be read directly after Chapter II. The contents of the first edition are basically reproduced here as Chapters I-VIII. Section IV.7, on the cardinality of the continuum, is completed by Section IV.7.3, discussing H. Woodin's discovery.
This collection of papers has its origin in a conference held at the University of Toronto in June of 1988. The theme of the conference was Physicalism in Mathematics: Recent Work in the Philosophy of Mathematics. At the conference, papers were read by Geoffrey Hellman (Minnesota), Yvon Gauthier (Montreal), Michael Hallett (McGill), Hartry Field (USC), Bob Hale (Lancaster & St Andrew's), Alasdair Urquhart (Toronto) and Penelope Maddy (Irvine). This volume supplements updated versions of six of those papers with contributions by Jim Brown (Toronto), John Bigelow (La Trobe), John Burgess (Princeton), Chandler Davis (Toronto), David Papineau (Cambridge), Michael Resnik (North Carolina at Chapel Hill), Peter Simons (Salzburg) and Crispin Wright (St Andrews & Michigan). Together they provide a vivid, expansive snapshot of the exciting work which is currently being carried out in philosophy of mathematics. Generous financial support for the original conference was provided by the Social Sciences & Humanities Research Council of Canada, the British Council, and the Department of Philosophy together with the Office of Internal Relations at the University of Toronto. Additional support for the production of this volume was gratefully received from the Social Sciences & Humanities Research Council of Canada.
The theory of fuzzy sets became known in Czechoslovakia in the early seventies. Since then, it has been applied in various areas of science, engineering and economics where indeterminate concepts had to be handled. There have been a number of national seminars and conferences devoted to this topic. However, the International Symposium on Fuzzy Approach to Reasoning and Decision-Making, held in 1990, was the first really representative international meeting of this kind organized in Czechoslovakia. The symposium took place in the House of Scientists of the Czechoslovak Academy of Sciences in Bechyně from June 25 till 29, 1990. Its main organizer was the Mining Institute of the Czechoslovak Academy of Sciences in Ostrava, in cooperation with, and with the support of, several other institutions and organizations. A crucial role in the preparation of the Symposium was played by the working group for Fuzzy Sets and Systems, which is active in the frame of the Society of Czechoslovak Mathematicians and Physicists. The organizing and program committee was headed by Dr. Vilém Novák from the Mining Institute in Ostrava. Its members (in alphabetical order) were Dr. Martin Černý (Prague), Prof. Blahoslav Harman (Liptovský Mikuláš), Ema Hyklová (Prague), Prof. Zdeněk Karpíšek (Brno), Jan Laub (Prague), Dr. Milan Mareš, vice-chairman (Prague), Prof. Radko Mesiar (Bratislava), Dr. Jiří Nekola, vice-chairman (Prague), Daria Nováková (Ostrava), Dr. Jaroslav Ramík (Ostrava), Prof. Dr. Beloslav Riečan (Bratislava), Dr. Jana Talašová (Přerov) and Dr. Miloš Vítek (Pardubice).
Combinatorial Algorithms on Words refers to the collection of manipulations of strings of symbols (words) - not necessarily from a finite alphabet - that exploit the combinatorial properties of the logical/physical input arrangement to achieve efficient computational performance. The model of computation may be any of the established serial paradigms (e.g. RAMs, Turing Machines), or one of the emerging parallel models (e.g. PRAM, WRAM, Systolic Arrays, CCC). This book focuses on some of the accomplishments of recent years in such disparate areas as pattern matching, data compression, free groups, coding theory, parallel and VLSI computation, and symbolic dynamics; these share a common flavor, yet have not been examined together in the past. In addition to being theoretically interesting, these studies have had significant applications. It happens that these works have all too frequently been carried out in isolation, with contributions addressing similar issues scattered throughout a rather diverse body of literature. We felt that it would be advantageous to both current and future researchers to collect this work in a single reference. It should be clear that the book's emphasis is on aspects of combinatorics and complexity rather than logic, foundations, and decidability. In view of the large body of research and the degree of unity already achieved by studies in the theory of automata and formal languages, we have allocated very little space to them.
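As one concrete example of the pattern-matching strand mentioned above (an illustration chosen by the editor, not an excerpt from the book), the border array underlying Knuth-Morris-Pratt matching exploits exactly the kind of combinatorial structure of words in question: a border is a proper prefix that is also a suffix, and precomputing border lengths lets the search skip redundant comparisons.

```python
def border_array(w):
    """border[i] = length of the longest proper border
    (prefix that is also a suffix) of w[:i+1]."""
    border = [0] * len(w)
    k = 0
    for i in range(1, len(w)):
        while k > 0 and w[i] != w[k]:
            k = border[k - 1]      # fall back to the next-shortest border
        if w[i] == w[k]:
            k += 1
        border[i] = k
    return border

def find(text, pattern):
    """KMP search: index of the first occurrence of pattern in text, or -1."""
    if not pattern:
        return 0
    border = border_array(pattern)
    k = 0
    for i, c in enumerate(text):
        while k > 0 and c != pattern[k]:
            k = border[k - 1]
        if c == pattern[k]:
            k += 1
        if k == len(pattern):
            return i - k + 1
    return -1

print(border_array("abacaba"))     # [0, 0, 1, 0, 1, 2, 3]
print(find("abracadabra", "cad"))  # 4
```

The search runs in time linear in the combined length of text and pattern, since each character advances or consumes previously banked matches.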
Approximate reasoning is a key motivation in fuzzy sets and possibility theory. This volume provides a coherent view of this field, and its impact on database research and information retrieval. First, the semantic foundations of approximate reasoning are presented. Special emphasis is given to the representation of fuzzy rules and specialized types of approximate reasoning. Then syntactic aspects of approximate reasoning are surveyed and the algebraic underpinnings of fuzzy consequence relations are presented and explained. The second part of the book is devoted to inductive and neuro-fuzzy methods for learning fuzzy rules. It also contains new material on the application of possibility theory to data fusion. The last part of the book surveys the growing literature on fuzzy information systems. Each chapter contains extensive bibliographical material. Fuzzy Sets in Approximate Reasoning and Information Systems is a major source of information for research scholars and graduate students in computer science and artificial intelligence, interested in human information processing.
One high-level ability of the human brain is to understand what it has learned. This seems to be the crucial advantage in comparison to the brain activity of other primates. At present we are technologically almost ready to artificially reproduce human brain tissue, but we still do not fully understand the information processing and the related biological mechanisms underlying this ability. Thus an electronic clone of the human brain is still far from being realizable. At the same time, around twenty years after the revival of the connectionist paradigm, we are not yet satisfied with the typical subsymbolic attitude of devices like neural networks: we can make them learn to solve even difficult problems, but without a clear explanation of why a solution works. Indeed, to widely use these devices in a reliable and non-elementary way we need formal and understandable expressions of the learnt functions. These must be susceptible of being tested, manipulated and composed with other similar expressions to build more structured functions as a solution of complex problems via the usual deductive methods of Artificial Intelligence. Many efforts have been steered in this direction in recent years, constructing artificial hybrid systems where a cooperation between the subsymbolic processing of neural networks merges in various modes with symbolic algorithms. In parallel, neurobiology research keeps on supplying more and more detailed explanations of the low-level phenomena responsible for mental processes.
When solving real-life engineering problems, linguistic information is often encountered that is frequently hard to quantify using "classical" mathematical techniques. This linguistic information represents subjective knowledge. Through the assumptions made by the analyst when forming the mathematical model, the linguistic information is often ignored. On the other hand, a wide range of traffic and transportation engineering parameters are characterized by uncertainty, subjectivity, imprecision, and ambiguity. Human operators, dispatchers, drivers, and passengers use this subjective knowledge or linguistic information on a daily basis when making decisions. Decisions about route choice, mode of transportation, most suitable departure time, or dispatching trucks are made by drivers, passengers, or dispatchers. In each case the decision maker is a human. The environment in which a human expert (human controller) makes decisions is most often complex, making it difficult to formulate a suitable mathematical model. Thus, the development of fuzzy logic systems seems justified in such situations. In certain situations we accept linguistic information much more easily than numerical information. In the same vein, we are perfectly capable of accepting approximate numerical values and making decisions based on them. In a great number of cases we use approximate numerical values exclusively. It should be emphasized that the subjective estimates of different traffic parameters differ from dispatcher to dispatcher, driver to driver, and passenger to passenger.
This monograph contains the results of our joint research over the last ten years on the logic of the fixed point operation. The intended audience consists of graduate students and research scientists interested in mathematical treatments of semantics. We assume the reader has a good mathematical background, although we provide some preliminary facts in Chapter 1. Written both for graduate students and research scientists in theoretical computer science and mathematics, the book provides a detailed investigation of the properties of the fixed point or iteration operation. Iteration plays a fundamental role in the theory of computation: for example, in the theory of automata, in formal language theory, in the study of formal power series, in the semantics of flowchart algorithms and programming languages, and in circular data type definitions. It is shown that in all structures that have been used as semantical models, the equational properties of the fixed point operation are captured by the axioms describing iteration theories. These structures include ordered algebras, partial functions, relations, finitary and infinitary regular languages, trees, synchronization trees, 2-categories, and others.
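The fixed point operation described above has, in its simplest semantic guise, Kleene's construction: iterate a monotone function from the bottom element until it stabilizes. A minimal sketch (assuming finite sets and a monotone step function; an editorial illustration, not code from the monograph), applied to flowchart-style reachability:

```python
def lfp(f, bottom=frozenset()):
    """Kleene iteration: least fixed point of a monotone function
    on finite sets, starting from the bottom element."""
    x = bottom
    while True:
        nxt = f(x)
        if nxt == x:
            return x
        x = nxt

# Hypothetical transition graph; reachability from state 1 as a fixed point.
edges = {1: {2, 3}, 2: {4}, 3: {4}, 4: set(), 5: {1}}

def step(reached):
    """One unfolding: the start state plus successors of reached states."""
    return frozenset({1}) | frozenset(s for r in reached for s in edges[r])

print(sorted(lfp(step)))  # [1, 2, 3, 4]  (state 5 is unreachable from 1)
```

Monotonicity of `step` guarantees the iteration climbs to the least fixed point; on an infinite domain one would need a continuity assumption for the same construction to converge in the limit.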
As of today, Evolutionary Computing and Fuzzy Set Computing are two mature, well-developed, and highly advanced technologies of information processing. Each of them has its own clearly defined research agenda, specific goals to be achieved, and a well-settled algorithmic environment. Concisely speaking, Evolutionary Computing (EC) is aimed at a coherent population-oriented methodology of structural and parametric optimization of a diversity of systems. In addition to this broad spectrum of optimization applications, this paradigm offers an important ability to cope with realistic goals and design objectives reflected in the form of relevant fitness functions. The GA search (which is often regarded as a dominant domain among other techniques of EC such as evolutionary strategies, genetic programming or evolutionary programming) delivers a great deal of efficiency helping navigate through large search spaces. The main thrust of fuzzy sets is in representing and managing nonnumeric (linguistic) information. The key notion (whose conceptual as well as algorithmic importance has started to increase in recent years) is that of information granularity. It somewhat concurs with the principle of incompatibility coined by L. A. Zadeh. Fuzzy sets form a vehicle helpful in expressing a granular character of information to be captured. Once quantified via fuzzy sets or fuzzy relations, the domain knowledge could be used efficiently, very often reducing a heavy computational burden when analyzing and optimizing complex systems.
This volume contains papers which are based primarily on talks given at an international conference on Algorithmic Problems in Groups and Semigroups held at the University of Nebraska-Lincoln from May 11-May 16, 1998. The conference coincided with the Centennial Celebration of the Department of Mathematics and Statistics at the University of Nebraska-Lincoln on the occasion of the one hundredth anniversary of the granting of the first Ph.D. by the department. Funding was provided by the US National Science Foundation, the Department of Mathematics and Statistics, and the College of Arts and Sciences at the University of Nebraska-Lincoln, through the College's focus program in Discrete, Experimental and Applied Mathematics. The purpose of the conference was to bring together researchers with interests in algorithmic problems in group theory, semigroup theory and computer science. A particularly useful feature of this conference was that it provided a framework for exchange of ideas between the research communities in semigroup theory and group theory, and several of the papers collected here reflect this interaction of ideas. The papers collected in this volume represent a cross section of some of the results and ideas that were discussed in the conference. They reflect a synthesis of overlapping ideas and techniques stimulated by problems concerning finite monoids, finitely presented monoids, finitely presented groups and free groups.