This book tackles head-on the challenges of digital design in the era of billion-transistor SoCs. It discusses the fundamental concepts of design and coding required to produce robust, functionally correct designs. It also provides specific techniques for measuring and minimizing complexity in RTL code. Finally, it discusses the tradeoff between RTL and high-level (C-based) design and how tools and languages must progress to address the needs of tomorrow's SoC designs.
This book presents new results on applications of geometric algebra. The time when researchers and engineers were starting to realize the potential of quaternions for applications in electrical, mechanical, and control engineering passed a long time ago. Since the publication of Space-Time Algebra by David Hestenes (1966) and Clifford Algebra to Geometric Calculus: A Unified Language for Mathematics and Physics by David Hestenes and Garret Sobczyk (1984), consistent progress in the applications of geometric algebra has taken place. Particularly due to the great developments in computer technology and the Internet, researchers have proposed new ideas and algorithms to tackle a variety of problems in the areas of computer science and engineering using the powerful language of geometric algebra. In this process, pioneer groups started the conference series entitled "Applications of Geometric Algebra in Computer Science and Engineering" (AGACSE) in order to promote research activity in the domain of the application of geometric algebra. The first conference, AGACSE'1999, organized by Eduardo Bayro-Corrochano and Garret Sobczyk, took place in Ixtapa-Zihuatanejo, Mexico, in July 1999. The contributions were published in Geometric Algebra with Applications in Science and Engineering, Birkhauser, 2001. The second conference, AGACSE'2001, was held in the Engineering Department of Cambridge University on 9-13 July 2001 and was organized by Leo Dorst, Chris Doran, and Joan Lasenby. The best conference contributions appeared as a book entitled Applications of Geometric Algebra in Computer Science and Engineering, Birkhauser, 2002. The third conference, AGACSE'2008, took place in August 2008 in Grimma, Leipzig, Germany.
Cartesian Genetic Programming (CGP) is a highly effective and increasingly popular form of genetic programming. It represents programs in the form of directed graphs, and a particular characteristic is its highly redundant genotype-phenotype mapping: genes can be non-coding. It has spawned a number of new forms, each improving efficiency, among them modular (embedded) CGP and self-modifying CGP. It has been applied to many problems in both computer science and applied sciences. This book contains chapters written by the leading figures in the development and application of CGP, and it will be essential reading for researchers in genetic programming and for engineers and scientists solving applications using these techniques. It will also be useful for advanced undergraduates and postgraduates seeking to understand and utilize a highly efficient form of genetic programming.
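The directed-graph genotype and non-coding genes mentioned above can be illustrated with a minimal sketch. The Python fragment below is not taken from the book; the node encoding, function set, and single-output convention are assumptions chosen purely to show how a CGP genotype decodes into a phenotype.

```python
# Minimal sketch of a Cartesian Genetic Programming genotype and its
# decoding, assuming a single-row graph, two-input functions, and one
# output gene. Node layout and function set are illustrative only.

FUNCTIONS = [
    lambda a, b: a + b,   # 0: add
    lambda a, b: a - b,   # 1: subtract
    lambda a, b: a * b,   # 2: multiply
]

def evaluate(genotype, output_gene, inputs):
    """Decode a CGP genotype and return the output value.

    genotype: list of (function_index, src_a, src_b) triples; sources
    refer either to program inputs (0..len(inputs)-1) or to earlier
    nodes (len(inputs)..).  Nodes never on the path to the output gene
    are 'non-coding': present in the genotype but absent from the
    phenotype's behaviour.
    """
    values = list(inputs)                      # input terminals first
    for fn_idx, src_a, src_b in genotype:      # nodes in feed-forward order
        values.append(FUNCTIONS[fn_idx](values[src_a], values[src_b]))
    return values[output_gene]

# Example: inputs x0=3, x1=4.  Node 2 computes x0*x1; node 3 computes
# x0+x1 but is non-coding because the output gene points at node 2.
genotype = [(2, 0, 1), (0, 0, 1)]
print(evaluate(genotype, output_gene=2, inputs=[3, 4]))   # -> 12
```

Mutating the non-coding node in this example changes the genotype without changing the program's output, which is the redundancy the blurb refers to.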
The importance of research and education in design continues to grow. For example, government agencies are gradually increasing funding of design research, and increasing numbers of engineering schools are revising their curricula to emphasize design. This is because of an increasing realization that design is part of the wealth creation of a nation and needs to be better understood and taught. The continuing globalization of industry and trade has required nations to re-examine where their core contributions lie if not in production efficiency. Design is a precursor to manufacturing for physical objects and is the precursor to implementation for virtual objects. At the same time, the need for sustainable development is requiring the design of new products and processes, and feeding a movement towards design innovations and inventions. There are now three sources for design research: design computing, design cognition, and human-centered information technology. The foundation for much of design computing remains artificial intelligence, with its focus on ways of representation and on processes that support simulation and generation. Artificial intelligence continues to provide an environmentally rich paradigm within which design research based on computational constructions can be carried out. Design cognition is founded on concepts from cognitive science, an even newer area than artificial intelligence. It provides tools and methods to study human designers in both laboratory and practice settings.
This volume constitutes the thoroughly refereed post-conference proceedings of the 8th International Conference on Mathematical Methods for Curves and Surfaces, MMCS 2012, held in Oslo, Norway, in June/July 2012. The 28 revised full papers presented were carefully reviewed and selected from 135 submissions. The topics range from mathematical analysis of various methods to practical implementation on modern graphics processing units. The papers reflect the newest developments in these fields and also point to the latest literature.
From the reviews: "[...] a welcome addition to the literature. [...] This book promises to make a valuable contribution to the education of graduate students in electrical and computer engineering, and a very useful addition to the library of the maturer investigator in SoC designs or related fields." Microelectronics Reliability
Computation and communication technologies underpin work and development in many different areas. Among them, Computer-Aided Design of electronic systems and eLearning technologies are two areas which, though different, in fact share many concerns. The design of CAD and eLearning systems already touches on a number of parallels, such as system interoperability, user interfaces, standardisation, XML-based formats, reusability aspects, etc. Furthermore, the teaching of Design Automation tools and methods is particularly amenable to a distant or blended learning setting, and implies the interconnection of typical CAD tools, such as simulators or synthesis tools, with eLearning tools. There are many other aspects in which synergy can be found when using eLearning technology for teaching and learning about technology. EduTech: Computer-Aided Design Meets Computer-Aided Learning contains the proceedings of the EduTech 2004 workshop, which was held in August 2004 in conjunction with the 18th IFIP World Computer Congress in Toulouse, France, and sponsored by the International Federation for Information Processing (IFIP). Organized by IFIP WG 10.5 (Design and Engineering of Electronic Systems) in cooperation with IFIP WG 3.6 (Distance Education), the workshop proceedings explore the interrelationship between these two subjects, where computer-aided design meets computer-aided learning. The book includes papers related to eLearning in the area of electronic CAD, but also includes contributions tackling general issues of eLearning that are applicable to this and many other areas, such as reusability, standards, open source tools or mobility. This book will be of value to those interested in the latest developments in eLearning in general, and also to those coming from the electronic design field who want to know how to apply these developments in their area.
Synthesis and Optimization of DSP Algorithms describes approaches taken to synthesising structural hardware descriptions of digital circuits from high-level descriptions of Digital Signal Processing (DSP) algorithms. The book contains:
- A tutorial on the subjects of digital design and architectural synthesis, intended for DSP engineers
- A tutorial on the subject of DSP, intended for digital designers
- A discussion of techniques for estimating the peak values likely to occur in a DSP system, thus enabling an appropriate signal scaling. Analytic techniques, simulation techniques, and hybrids are discussed, and the applicability of different analytic approaches to different types of DSP design is covered
- The development of techniques to optimise the precision requirements of a DSP algorithm, aiming for efficient implementation in a custom parallel processor. The idea is to trade off numerical accuracy for area or power-consumption advantages (a worked sketch of this trade-off follows below). Again, both analytic and simulation techniques for estimating numerical accuracy are described and contrasted, and optimum and heuristic approaches to precision optimisation are discussed
- A discussion of the importance of the scheduling, allocation, and binding problems, and the development of techniques to automate these processes with reference to a precision-optimized algorithm
- Future perspectives for synthesis and optimization of DSP algorithms
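As a rough illustration of the accuracy-versus-word-length trade-off referred to in the precision-optimisation item above, the following Python sketch uses the simulation approach: it compares a double-precision FIR filter against fixed-point versions with progressively fewer fractional bits. The coefficients, rounding scheme, and error metric are assumptions for illustration only, not the book's techniques.

```python
# Simulation-based accuracy estimate for a fixed-point FIR filter,
# assuming simple rounding to a given number of fractional bits.
# Fewer bits mean cheaper hardware but larger numerical error.
import random

def quantize(x, frac_bits):
    """Round x onto a fixed-point grid with 2**-frac_bits resolution."""
    step = 2.0 ** -frac_bits
    return round(x / step) * step

def fir(signal, coeffs, frac_bits=None):
    """Direct-form FIR; optionally quantize coefficients and products."""
    if frac_bits is not None:
        coeffs = [quantize(c, frac_bits) for c in coeffs]
    out = []
    for n in range(len(signal)):
        acc = 0.0
        for k, c in enumerate(coeffs):
            if n - k >= 0:
                p = c * signal[n - k]
                acc += quantize(p, frac_bits) if frac_bits is not None else p
        out.append(acc)
    return out

random.seed(0)
x = [random.uniform(-1, 1) for _ in range(256)]
h = [0.1, 0.25, 0.3, 0.25, 0.1]
ref = fir(x, h)                          # full-precision reference
for bits in (12, 8, 4):
    y = fir(x, h, frac_bits=bits)
    err = max(abs(a - b) for a, b in zip(ref, y))
    print(f"{bits} fractional bits -> worst-case error {err:.6f}")
```

The measured worst-case error grows as the word length shrinks, which is exactly the quantity a precision-optimisation flow trades against area or power.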
Analog Behavioral Modeling With The Verilog-A Language provides the IC designer with an introduction to the methodologies and uses of analog behavioral modeling with the Verilog-A language. It presents an overview of Verilog-A language constructs as well as applications using the language. In addition, the book is accompanied by the Verilog-A Explorer IDE (Integrated Development Environment), a limited-capability Verilog-A enhanced SPICE simulator for further learning and experimentation with the Verilog-A language. This book assumes a basic level of understanding of the usage of SPICE-based analog simulation and the Verilog HDL language, although any programming language background and a little determination should suffice. From the Foreword: 'Verilog-A is a new hardware design language (HDL) for analog circuit and systems design. Since the mid-eighties, Verilog HDL has been used extensively in the design and verification of digital systems. However, there have been no analogous high-level languages available for analog and mixed-signal circuits and systems. Verilog-A provides a new dimension of design and simulation capability for analog electronic systems. Previously, analog simulation has been based upon the SPICE circuit simulator or some derivative of it. Digital simulation is primarily performed with a hardware description language such as Verilog, which is popular since it is easy to learn and use. Making Verilog more worthwhile is the fact that several tools exist in the industry that complement and extend Verilog's capabilities ... Behavioral Modeling With the Verilog-A Language provides a good introduction and starting place for students and practicing engineers with interest in understanding this new level of simulation technology. This book contains numerous examples that enhance the text material and provide a helpful learning tool for the reader. The text and the simulation program included can be used for individual study or in a classroom environment ...' Dr. Thomas A. DeMassa, Professor of Engineering, Arizona State University
This unique book provides an overview of the current state of the art and very recent research results that have been achieved as part of the Low-Power Initiative of the European Union, in the field of analogue, RF and mixed-signal design methodologies and CAD tools.
Behavioral Synthesis: A Practical Guide to High-Level Design includes details on new material and new interpretations of old material with an emphasis on practical information. The intended audience is the ASIC (or high-end FPGA) designer who will be using behavioral synthesis, the manager who will be working with those designers, or the engineering student who is studying leading-edge design techniques. Today's designs are creating tremendous pressures for digital designers. Not only must they compress more functionality onto a single IC, but this has to be done on shorter schedules to stay ahead in extremely competitive markets. To meet these opposing demands, designers must work at a new, higher level of abstraction to efficiently make the kind of architectural decisions that are critical to the success of today's complex designs. In other words, they must include behavioral design in their flow. The biggest challenge to adopting behavioral design is changing the mindset of the designer. Instead of describing system functionality in great detail, the designer outlines the design in broader, more abstract terms. The ability to easily and efficiently consider multiple design alternatives over a wide range of cost and performance is an extremely persuasive reason to make this leap to a higher level of abstraction. Designers who learn to think and work at the behavioral level will reap major benefits in the resultant quality of the final design. But such changes in methodology are difficult to achieve rapidly. Education is essential to making this transition. Many designers will recall the difficulty of transitioning from schematic-based design to RTL design. Designers who were new to the technology often felt that they had not been told enough about how synthesis worked and that they were not taught how to effectively write HDL code that would synthesize efficiently. Using this unique book, a designer will understand what behavioral synthesis tools are doing (and why) and how to effectively describe their designs so that they are appropriately synthesized. CD-ROM INCLUDED! The accompanying CD-ROM contains the source code and test benches for the three case studies discussed in Chapters 14, 15 and 16.
Regular Nanofabrics in Emerging Technologies gives a deep insight into both the fabrication and design aspects of emerging semiconductor technologies that represent potential candidates for the post-CMOS era. Its approach is unique, cutting across different fields, and it offers a synergetic view for an audience of different communities, ranging from technologists to circuit designers and computer scientists. The book presents two technologies as potential candidates for future semiconductor devices and systems and shows how fabrication issues can be addressed at the design level and vice versa. Readers, whether approaching the book for academic or research purposes, will find novel material that is explained carefully for both experts and non-initiated readers. Regular Nanofabrics in Emerging Technologies is a survey of post-CMOS technologies. It explains processing, circuit-, and system-level design for people with various backgrounds.
From the reviews: "This book crystallizes what may become a defining moment in the electronics industry - the shift to platform-based design. It provides the first comprehensive guidebook for those who will build, and use, the integration platforms that may soon drive the system-on-chip revolution." Electronic Engineering Times
The Verilog Hardware Description Language (Verilog-HDL) has long been the most popular language for describing complex digital hardware. It started life as a proprietary language but was donated by Cadence Design Systems to the design community to serve as the basis of an open standard. That standard was formalized in 1995 by the IEEE in standard 1364-1995. About that same time a group named Analog Verilog International formed with the intent of proposing extensions to Verilog to support analog and mixed-signal simulation. The first fruits of the labor of that group became available in 1996 when the language definition of Verilog-A was released. Verilog-A was not intended to work directly with Verilog-HDL. Rather it was a language with similar syntax and related semantics that was intended to model analog systems and be compatible with SPICE-class circuit simulation engines. The first implementation of Verilog-A soon followed: a version from Cadence that ran on their Spectre circuit simulator. As more implementations of Verilog-A became available, the group defining the analog and mixed-signal extensions to Verilog continued their work, releasing the definition of Verilog-AMS in 2000. Verilog-AMS combines both Verilog-HDL and Verilog-A, and adds additional mixed-signal constructs, providing a hardware description language suitable for analog, digital, and mixed-signal systems. Again, Cadence was first to release an implementation of this new language, in a product named AMS Designer that combines their Verilog and Spectre simulation engines.
Writing Testbenches: Functional Verification of HDL Models first introduces the necessary concepts and tools of verification, then describes a process for carrying out an effective functional verification of a design. This book also presents techniques for applying a stimulus and monitoring the response of a design by abstracting the operations using bus-functional models. The architecture of testbenches built around these bus-functional models is important for minimizing development and maintenance effort. Behavioral modeling is another important concept presented in this book. It is used to parallelize the implementation and verification of a design and to perform more efficient simulations. For many, behavioral modeling is synonymous with synthesizable or RTL modeling. In this book, the term 'behavioral' is used to describe any model that adequately emulates the functionality of a design, usually using non-synthesizable constructs and coding style. Writing Testbenches: Functional Verification of HDL Models focuses on the functional verification of hardware designs using either VHDL or Verilog. The reader should have at least a basic knowledge of one of the languages. Ideally, he or she should have experience in writing synthesizable models and be familiar with running a simulation using any of the available VHDL or Verilog simulators. From the Foreword: 'With gate counts and system complexity growing exponentially, engineers confront the most perplexing challenge in product design: functional verification. The bulk of the time consumed in the design of new ICs and systems is now spent on verification. New and interesting design technologies like physical synthesis and design reuse that create ever-larger designs only aggravate the problem. What the EDA tool industry has continuously failed to realize is that the real problem is not how to create a 12 million gate IC that runs at 600 MHz, but how to verify it. This text marks the first genuine effort at defining a verification methodology that is independent of both tools and applications. Engineers now have a true reference text for quickly and accurately verifying the functionality of their designs.' Michael Horne, President and CEO, Qualis Design Corporation
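The bus-functional-model idea described above, abstracting pin-level activity behind transaction-level operations, can be sketched in a language-neutral way. The book targets VHDL and Verilog testbenches; the Python fragment below, including its invented bus protocol and the MemoryDUT stub, is only an assumption-laden illustration of the layering, not the book's code.

```python
# Sketch of a bus-functional model (BFM): the test calls write()/read()
# transactions, and only the BFM knows how those map onto cycle-by-cycle
# "pin" activity. The protocol and the DUT stand-in are invented here.

class MemoryDUT:
    """Trivial stand-in for a device under test: a registered memory."""
    def __init__(self):
        self.mem, self.pins, self.dout = {}, {}, 0

    def drive(self, **pins):
        self.pins.update(pins)            # testbench drives input "pins"

    def clock(self):
        if self.pins.get("we"):
            self.mem[self.pins["addr"]] = self.pins["data"]
        if self.pins.get("re"):
            self.dout = self.mem.get(self.pins["addr"], 0)

    def sample(self):
        return self.dout                  # testbench samples output "pins"

class SimpleBusBFM:
    """Exposes transactions; hides the pin wiggling of one bus protocol."""
    def __init__(self, dut):
        self.dut = dut

    def write(self, address, data):
        self.dut.drive(addr=address, data=data, we=1, re=0)
        self.dut.clock()
        self.dut.drive(we=0)

    def read(self, address):
        self.dut.drive(addr=address, we=0, re=1)
        self.dut.clock()
        return self.dut.sample()

# The test stays at the transaction level; switching bus protocols means
# writing a new BFM, not rewriting every test.
bfm = SimpleBusBFM(MemoryDUT())
bfm.write(0x10, 0xCAFE)
assert bfm.read(0x10) == 0xCAFE
```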
This book constitutes the refereed proceedings of the 11th International Conference on Cooperative Design, Visualization, and Engineering, CDVE 2014, held in Seattle, WA, USA, in September 2014. The 33 full and 10 short papers presented were carefully reviewed and selected from 78 submissions. The papers cover topics such as cloud technology; the use of cloud for manufacturing, resource selection, service evaluation, and control; methods for processing and visualizing big data created by social media, such as Twitter and Facebook; real-time data about human interaction; sentiment analysis; trend analysis; location-based crowdsourcing; effective teamwork; and cooperative visualization.
What is 'design creativity'? It is impossible to answer this question without considering why human beings can - and do - 'design'. Design creativity is instrumental in not only addressing social problems faced across the world, but also evoking an innate appreciation for beauty and a sense of personal contentment. Design Creativity 2010 comprises advanced research findings on design creativity and perspectives on future directions of design creativity research. The papers included were presented and discussed at the first ICDC (International Conference on Design Creativity), which was held at Kobe, Japan, in 2010. Design Creativity 2010 encourages readers to enhance and expand their activities in the field of design creativity.
Modern electronic testing has a forty-year history. Test professionals hold some fairly large conferences and numerous workshops, have a journal, and there are over one hundred books on testing. Still, a full course on testing is offered only at a few universities, mostly by professors who have a research interest in this area. Apparently, most professors would not have taken a course on electronic testing when they were students. Other than the computer engineering curriculum being too crowded, the major reason cited for the absence of a course on electronic testing is the lack of a suitable textbook. For VLSI the foundation was provided by semiconductor device technology, circuit design, and electronic testing. In a computer engineering curriculum, therefore, it is necessary that foundations should be taught before applications. The field of VLSI has expanded to systems-on-a-chip, which include digital, memory, and mixed-signal subsystems. To our knowledge this is the first textbook to cover all three types of electronic circuits. We have written this textbook for an undergraduate "foundations" course on electronic testing. Obviously, it is too voluminous for a one-semester course and a teacher will have to select from the topics. We did not restrict such freedom because the selection may depend upon individual expertise and interests. Besides, there is merit in having a larger book that will retain its usefulness for the owner even after the completion of the course. With equal tenacity, we address the needs of three other groups of readers.
In the past decade, substrate noise has had a constant and significant impact on the design of analog and mixed-signal integrated circuits. Only recently, with advances in chip miniaturization and innovative circuit design, has substrate noise begun to plague fully digital circuits as well. To combat the effects of substrate noise, heavily over-designed structures are generally adopted, thus seriously limiting the advantages of innovative technologies. Substrate Noise: Analysis and Optimization for IC Design addresses the main problems posed by substrate noise from both an IC and a CAD designer perspective. The effects of substrate noise on performance in digital, analog, and mixed-signal circuits are presented, along with the mechanisms underlying noise generation, injection, and transport. Popular solutions to the substrate noise problem and the trade-offs often debated by designers are extensively discussed. Non-traditional approaches as well as semi-automated techniques to combat substrate noise are also addressed. Substrate Noise: Analysis and Optimization for IC Design will be of interest to researchers and professionals interested in signal integrity, as well as to mixed signal and RF designers.
Over the last 15 years, the application of innovative steel concepts in the automotive industry has increased steadily. Numerical simulation technology of hot forming of high-strength steel allows engineers to modify the formability of hot forming steel metals and to optimize die design schemes. Theories, Methods and Numerical Technology of Sheet Metal Cold and Hot Forming focuses on hot and cold forming theories, numerical methods, relative simulation and experiment techniques for high-strength steel forming and die design in the automobile industry. Theories, Methods and Numerical Technology of Sheet Metal Cold and Hot Forming introduces the general theories of cold forming, then expands upon advanced hot forming theories and simulation methods, including: the forming process, constitutive equations, hot boundary constraint treatment, and hot forming equipment and experiments. Various calculation methods of cold and hot forming, based on the authors' experience in commercial CAE software for sheet metal forming, are provided, as well as a discussion of key issues, such as hot formability with quenching process, die design and cooling channel design in die, and formability experiments. Theories, Methods and Numerical Technology of Sheet Metal Cold and Hot Forming will enable readers to develop an advanced knowledge of hot forming, as well as to apply hot forming theories, calculation methods and key techniques to direct their die design. It is therefore a useful reference for students and researchers, as well as automotive engineers.
Computing increasingly happens somewhere, with that geographic location important to the computational process itself. Many new and evolving spatial technologies, such as geosensor networks and smartphones, embody this trend. Conventional approaches to spatial computing are centralized, and do not account for the inherently decentralized nature of "computing somewhere": the limited, local knowledge of individual system components, and the interaction between those components at different locations. On the other hand, despite being an established topic in distributed systems, decentralized computing is not concerned with geographical constraints to the generation and movement of information. In this context, of (centralized) spatial computing and decentralized (non-spatial) computing, the key question becomes: "What makes decentralized spatial computing special?" In Part I of the book the author covers the foundational concepts, structures, and design techniques for decentralized computing with spatial and spatiotemporal information. In Part II he applies those concepts and techniques to the development of algorithms for decentralized spatial computing, stepping through a suite of increasingly sophisticated algorithms: from algorithms with minimal spatial information about their neighborhoods; to algorithms with access to more detailed spatial information, such as direction, distance, or coordinate location; to truly spatiotemporal algorithms that monitor environments that are dynamic, even using networks that are mobile or volatile. Finally, in Part III the author shows how decentralized spatial and spatiotemporal algorithms designed using the techniques explored in Part II can be simulated and tested. In particular, he investigates empirically the important properties of a decentralized spatial algorithm: its computational efficiency and its robustness to unavoidable uncertainty. Part III concludes with a survey of the opportunities for connecting decentralized spatial computing to ongoing research and emerging hot topics in related fields, such as biologically inspired computing, geovisualization, and stream computing. The book is written for students and researchers of computer science and geographic information science. Throughout the book the author's style is characterized by a focus on the broader message, explaining the process of decentralized spatial algorithm design rather than the technical details. Each chapter ends with review questions designed to test the reader's understanding of the material and to point to further work or research. The book includes short appendices on discrete mathematics and SQL. Simulation models written in NetLogo and associated source code for all the algorithms presented in the book can be found on the author's accompanying website.
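A flavour of the decentralized algorithms discussed above, in which each node acts only on knowledge of its immediate neighborhood, is given by the following Python sketch of gossip-style consensus in a small sensor network. The topology, readings, and synchronous update rule are invented for illustration and are not taken from the book (the author's own simulation models are written in NetLogo).

```python
# Sketch of a decentralized computation: each geosensor node knows only
# its own reading and its immediate neighbors, yet repeated local
# averaging drives every node toward a shared consensus value.
# Topology and readings are invented purely for illustration.

neighbors = {            # adjacency of a small sensor network
    "a": ["b"],
    "b": ["a", "c", "d"],
    "c": ["b", "d"],
    "d": ["b", "c"],
}
reading = {"a": 10.0, "b": 20.0, "c": 30.0, "d": 40.0}

for _ in range(50):                           # synchronous gossip rounds
    new = {}
    for node, nbrs in neighbors.items():
        local = [reading[node]] + [reading[n] for n in nbrs]
        new[node] = sum(local) / len(local)   # purely local information
    reading = new

print({n: round(v, 2) for n, v in reading.items()})
```

No node ever sees the whole network, which is the defining constraint of "computing somewhere" that the book explores.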
I am glad to see this new book on the e language and on verification. I am especially glad to see a description of the e Reuse Methodology (eRM). The main goal of verification is, after all, finding more bugs quicker using given resources, and verification reuse (module-to-system, old-system-to-new-system, etc.) is a key enabling component. This book offers a fresh approach in teaching the e hardware verification language within the context of coverage-driven verification methodology. I hope it will help the reader understand the many important and interesting topics surrounding hardware verification. Yoav Hollander, Founder and CTO, Verisity Inc. Preface: This book provides a detailed coverage of the e hardware verification language (HVL), state-of-the-art verification methodologies, and the use of e HVL as a facilitating verification tool in implementing a state-of-the-art verification environment. It includes comprehensive descriptions of the new concepts introduced by the e language, e language syntax, and its associated semantics. This book also describes the architectural views and requirements of verification environments (randomly generated environments, coverage-driven verification environments, etc.), verification blocks in the architectural views (i.e. generators, initiators, collectors, checkers, monitors, coverage definitions, etc.) and their implementations using the e HVL. Moreover, the e Reuse Methodology (eRM), the motivation for defining such a guideline, and step-by-step instructions for building an eRM-compliant e Verification Component (eVC) are also discussed.
Regular Fabrics in Deep Sub-Micron Integrated-Circuit Design discusses new approaches to better timing-closure and manufacturability of DSM Integrated Circuits. The key idea presented is the use of regular circuit and interconnect structures such that area/delay can be predicted with high accuracy. The co-design of structures and algorithms allows great opportunities for achieving better final results, thus closing the gap between IC and CAD designers. The regularities also provide simpler and possibly better manufacturability. In this book we present not only algorithms for solving particular sub-problems but also systematic ways of organizing different algorithms in a flow to solve the design problem as a whole. A timing-driven chip design flow is developed based on the new structures and their design algorithms, which produces faster chips in a shorter time.
As integrated circuit (IC) feature sizes scaled below a quarter of a micron, thereby defining the deep submicron (DSM) era, there began a gradual shift in the impact on performance due to the metal interconnections among the active circuit components. Once viewed as merely parasitics in terms of their relevance to the overall circuit behavior, the interconnect can now have a dominant impact on the IC area and performance. Beginning in the late 1980's there was significant research toward better modeling and characterization of the resistance, capacitance and ultimately the inductance of on-chip interconnect. IC Interconnect Analysis covers the state-of-the-art methods for modeling and analyzing IC interconnect based on the past fifteen years of research. This is done at a level suitable for most practitioners who work in the semiconductor and electronic design automation fields, but also includes significant depth for the research professionals who will ultimately extend this work into other areas and applications. IC Interconnect Analysis begins with an in-depth coverage of delay metrics, including the ubiquitous Elmore delay and its many variations. This is followed by an outline of moment matching methods, calculating moments efficiently, and Krylov subspace methods for model order reduction. The final two chapters describe how to interface these reduced-order models to circuit simulators and gate-level timing analyzers respectively. IC Interconnect Analysis is written for CAD tool developers, IC designers and graduate students.
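The Elmore delay mentioned above has a particularly simple closed form for an RC ladder: each capacitor contributes its capacitance multiplied by the resistance it shares with the path from the driver to the output. The short Python sketch below evaluates that sum for an illustrative three-segment ladder; the resistance and capacitance values are arbitrary assumptions, and real interconnect analysis would move on to the moment-matching and model-order-reduction methods the book covers.

```python
# Elmore delay at the far end of an N-stage RC ladder driven at node 0.
# For a ladder, the resistance shared between the path to a capacitor
# and the path to the output is simply that capacitor's upstream
# resistance, so delay = sum_i C[i] * (R[0] + ... + R[i]).

def elmore_delay_ladder(R, C):
    """R[i]: resistance of segment i; C[i]: capacitance at the node after it."""
    delay, upstream_r = 0.0, 0.0
    for r, c in zip(R, C):
        upstream_r += r          # total resistance from driver to this node
        delay += c * upstream_r
    return delay

# Three identical segments of 100 ohms and 10 fF each (illustrative values):
R = [100.0, 100.0, 100.0]        # ohms
C = [10e-15, 10e-15, 10e-15]     # farads
print(f"Elmore delay = {elmore_delay_ladder(R, C) * 1e12:.1f} ps")   # 6.0 ps
```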
The proliferation and growth of Electronic Design Automation (EDA) has spawned many diverse and interesting technologies. One of the most prominent of these technologies is the VHSIC Hardware Description Language, or VHDL. VHDL permits designers of digital modules, components, systems, and even networks to describe their designs both structurally and behaviorally. VHDL also allows simulation of the designs in order to investigate their performance prior to actually implementing them in hardware. Having gained the ability to simulate designs once encoded in VHDL, designers were naturally confronted with the issue of testing these designs. VHDL did not explicitly address the requirement to insert particular digital waveforms, often termed test vectors or patterns, or to subsequently assess the correctness of the response from some digital entity. In a distributed design environment, or even in an isolated one where the design was subject to review or scrutiny by another organization, de-facto methods of testing and evaluating results proved faulty. The reason was a lack of standardization. When organization A designed a circuit and tested it with their self-developed test tools it had a certain behavior. When it was delivered to organization B and B tested it using their test tools, the behavior was different. Was the fault in the circuit, in A's tools, or in B's tools? The only way to resolve this was for both organizations to agree on a test apparatus, validate its correctness and use it consistently. While VHDL was an IEEE standard language, and consistency among myriad designers was fairly well guaranteed, no such standard existed for test waveform generation and assessment. Hence, the value of standardization in the design language was being negated by the lack of such a standard for testing. The Waveform and Vector Exchange Specification, or WAVES, was conceived and designed to solve this testing problem -- and it has. Being both a subset of VHDL itself, as well as an IEEE standard, it guarantees both conformity among multiple applications and easy integration with VHDL units under test (UUTs). Using WAVES and VHDL for Effective Design and Testing will serve many purposes. For the WAVES beginner, its tutorial will make the application of WAVES in typical, standard usage straightforward and convenient. For the more advanced user, the advanced topics will provide insight into the nuances of these useful capabilities. For all users, the tools, templates and examples given in the chapters, as well as on the companion disk, will provide a practical starting foundation for using WAVES and VHDL.
You may like...
Shocks in Astrophysics - Proceedings of… by T.J. Millar, A.C. Raga (Hardcover), R4,197 (Discovery Miles 41 970)
The Evolution of Galaxies… by Jose M. Vilchez, Grazyna Stasinska, … (Hardcover), R5,494 (Discovery Miles 54 940)
From Twilight to Highlight: The Physics… by Wolfgang Hillebrandt, Bruno Leibundgut (Hardcover), R1,307 (Discovery Miles 13 070)