Focuses on the common recurring physical principles behind sophisticated modern devices. This book discusses the principles of physics through applications of state-of-the-art technologies and advanced instruments. The authors use diagrams, sketches, and graphs coupled with equations and mathematical analysis to enhance the reader's understanding of modern devices. Readers will learn to identify common underlying physical principles that govern several types of devices, while gaining an understanding of the performance trade-offs imposed by the physical limitations of various processing methods. The topics discussed in the book assume readers have taken an introductory physics course and college algebra, and have a basic understanding of calculus. * Describes the basic physics behind a large number of devices encountered in everyday life, from the air conditioner to Blu-ray discs * Covers state-of-the-art devices such as spectrographs, photoelectric image sensors, spacecraft systems, astronomical and planetary observatories, biomedical imaging instruments, particle accelerators, and jet engines * Includes access to a book companion site that houses PowerPoint slides. Modern Devices: The Simple Physics of Sophisticated Technology is designed as a reference for professionals who would like to gain a basic understanding of the operation of complex technologies. The book is also suitable as a textbook for upper-level undergraduate non-major students interested in physics.
The world has become digital, and technological advances have multiplied the channels for accessing, processing, and disseminating data. New technologies have now reached a certain maturity, and data are available to everyone, anywhere on the planet. The number of Internet users in 2014 was 2.9 billion, or 41% of the world population. A need for knowledge is becoming apparent in order to make sense of this multitude of data: we must educate, inform and train the masses. The development of related technologies, such as the advent of the Internet, social networks, and cloud computing (digital factories), has increased the available volumes of data. Currently, each individual creates, consumes, and uses digital information: more than 3.4 million e-mails are sent worldwide every second, or 107,000 billion annually, about 14,600 e-mails per year per person, of which more than 70% are spam. Billions of pieces of content are shared on social networks such as Facebook, more than 2.46 million every minute. We spend more than 4.8 hours a day on the Internet using a computer, and 2.1 hours using a mobile. Data, this new ethereal manna from heaven, is produced in real time. It arrives in a continuous stream from a multitude of sources which are generally heterogeneous. This accumulation of data of all types (audio, video, files, photos, etc.) generates new activities whose aim is to analyze this enormous mass of information. It is then necessary to adapt and to try new approaches, new methods, new knowledge and new ways of working, resulting in new properties and new challenges, since indexing (SEO) logic must be created and implemented. At the company level, this mass of data is difficult to manage, and its interpretation is above all a challenge. This impacts those who are there to "manipulate" the mass, and it requires a specific infrastructure for creation, storage, processing, analysis and recovery. The biggest challenge lies in "valuing the data" available in quantity, diversity and access speed.
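The headline e-mail figures quoted above are mutually consistent, as a quick back-of-the-envelope check shows (the ~7.3 billion world-population figure for 2014 is our assumption, not stated in the text):

```python
# Rough sanity check of the e-mail figures quoted above (illustrative
# assumptions: 3.4 million e-mails per second, ~7.3 billion people in 2014).
SECONDS_PER_YEAR = 365 * 24 * 3600          # 31,536,000
emails_per_second = 3.4e6
world_population = 7.3e9

emails_per_year = emails_per_second * SECONDS_PER_YEAR
per_person = emails_per_year / world_population

print(f"{emails_per_year:.3e} e-mails per year")   # ~1.07e+14, i.e. 107,000 billion
print(f"{per_person:.0f} per person per year")     # ~14,700, close to the quoted 14,600
```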
A practical, step-by-step guide to designing world-class, high-availability systems using both classical and DFSS reliability techniques. Whether designing telecom, aerospace, automotive, medical, financial, or public safety systems, every engineer aims for the utmost reliability and availability in the systems he or she designs. But between the dream of world-class performance and reality falls the shadow of complexities that can bedevil even the most rigorous design process. While there is an array of robust predictive engineering tools, there has been no single-source guide to understanding and using them... until now. Offering a case-based approach to designing, predicting, and deploying world-class high-availability systems from the ground up, this book brings together the best classical and DFSS reliability techniques. Although it focuses on technical aspects, this guide considers the business and market constraints that require that systems be designed right the first time. Written in plain English and following a step-by-step "cookbook" format, "Designing High Availability Systems": shows how to integrate an array of design/analysis tools, including Six Sigma, failure analysis, and reliability analysis; features many real-life examples and case studies describing predictive design methods, tradeoffs, risk priorities, "what-if" scenarios, and more; delivers numerous high-impact takeaways that you can apply to your current projects immediately; and provides access to MATLAB(R) programs for simulating the problem sets presented, along with PowerPoint slides to assist in outlining the problem-solving process. "Designing High Availability Systems" is an indispensable working resource for system engineers, software/hardware architects, and project teams working in all industries.
Supply chains for electronic products are primarily driven by consumer electronics. Every year new mobile phones, computers and gaming consoles are introduced, driving the continued applicability of Moore's law. The semiconductor manufacturing industry is highly dynamic and releases new, better and cheaper products day by day. But what happens to long field-life products like airplanes or ships, which need the same components for decades? How do electronic and also non-electronic systems that need to be manufactured and supported for decades manage to continue operation using parts that were available for only a few years? This book attempts to answer these questions. This is the only book on the market that covers obsolescence forecasting methodologies, including forecasting tactics for hardware and software that enable cost-effective proactive product life-cycle management. This book describes how to implement a comprehensive obsolescence management system within diverse companies. "Strategies to the Prediction, Mitigation and Management of Product Obsolescence" is a must-have work for all professionals in product/project management, sustainment engineering and purchasing.
Teaches digital signal processing concepts via hands-on examples. The OMAP-L138 eXperimenter is the latest inexpensive DSP development system to be adopted by the Texas Instruments University Program. The OMAP-L138 processor contains both ARM and DSP cores and is aimed at portable and mobile multimedia applications. This book concentrates on the demonstration of real-time DSP algorithms implemented on its C6748 DSP core. "Digital Signal Processing and Applications with the OMAP-L138 eXperimenter" provides an extensive and comprehensive set of program examples to aid instructors in teaching DSP in a laboratory using audio frequency signals, making it an ideal text for DSP courses at senior undergraduate and postgraduate levels. Subjects covered include polling-based, interrupt-based, and DMA-based I/O methods, and how real-time programs may be run using the board support library (BSL), the DSP/BIOS real-time operating system, or the DSP/BIOS Platform Support Package. Chapters include: analog input and output with the OMAP-L138 eXperimenter; finite impulse response filters; infinite impulse response filters; the fast Fourier transform; adaptive filters; and DSP/BIOS and the platform support package. Each chapter begins with a review of background theory and then presents a number of real-time program examples to reinforce understanding of that theory and to demonstrate the use of the OMAP-L138 eXperimenter and Texas Instruments Code Composer Studio integrated development environment.
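To give a flavour of the kind of algorithm the book implements (its real-time examples are written in C for the C6748 core; the Python below is purely an illustrative sketch of a direct-form FIR filter, not code from the book):

```python
# Minimal sketch of a direct-form FIR filter: y[n] = sum_k h[k] * x[n-k].
# Illustrative only; the book's versions run in real time in C on the C6748.
def fir_filter(x, h):
    y = []
    for n in range(len(x)):
        acc = 0.0
        for k in range(len(h)):
            if n - k >= 0:              # skip samples before the input starts
                acc += h[k] * x[n - k]
        y.append(acc)
    return y

# A 3-tap moving average smooths a noisy step input
h = [1/3, 1/3, 1/3]
x = [0, 0, 1, 1, 1, 1]
print(fir_filter(x, h))
```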
A timely text on the recent developments in data storage, from a materials perspective Ever-increasing amounts of data storage on hard disk have been made possible largely due to the immense technological advances in the field of data storage materials. Developments in Data Storage: Materials Perspective covers the recent progress and developments in recording technologies, including the emerging non-volatile memory, which could potentially become storage technologies of the future. Featuring contributions from experts around the globe, this book provides engineers and graduate students in materials science and electrical engineering a solid foundation for grasping the subject. The book begins with the basics of magnetism and recording technology, setting the stage for the following chapters on existing methods and related research topics. These chapters focus on perpendicular recording media to underscore the current trend of hard disk media; read sensors, with descriptions of their fundamental principles and challenges; and write head, which addresses the advanced concepts for writing data in magnetic recording. Two chapters are devoted to the highly challenging area in hard disk drives of tribology, which deals with reliability, corrosion, and wear-resistance of the head and media. Next, the book provides an overview of the emerging technologies, such as heat-assisted magnetic recording and bit-patterned media recording. Non-volatile memory has emerged as a promising alternative storage option for certain device applications; two chapters are dedicated to non-volatile memory technologies such as the phase-change and the magnetic random access memories. With a strong focus on the fundamentals along with an overview of research topics, Developments in Data Storage is an ideal reference for graduate students or beginners in the field of magnetic recording. It also serves as an invaluable reference for future storage technologies including non-volatile memories.
This book provides the basics needed to develop sensor network software and supplements it with many case studies covering network applications. It also examines how to develop onboard applications on individual sensors, how to interconnect these sensors, and how to form networks of sensors, although the major aim of this book is to provide foundational principles of developing sensor networking software and critically examine sensor network applications.
A major update of solar cell technology and the solar marketplace Since the first publication of this important volume over a decade ago, dramatic changes have taken place with the solar market growing almost 100-fold and the U.S. moving from first to fourth place in the world market as analyzed in this Second Edition. Three bold new opportunities are identified for any countries wanting to improve market position. The first is combining pin solar cells with 3X concentration to achieve economic competitiveness near term. The second is charging battery-powered cars with solar cell-generated electricity from arrays in surrounding areas--including the car owners' homes--while simultaneously reducing their home electricity bills by over ninety percent. The third is formation of economic "unions" of sufficient combined economic size to be major competitors. In this updated edition, feed-in tariffs are identified as the most effective approach for public policy. Reasons are provided to explain why pin solar cells outperform more traditional pn solar cells. Field test data are reported for nineteen percent pin solar cells and for 500X concentrating systems with bare cell efficiencies approaching forty percent. Paths to bare cell efficiencies over fifty percent are described, and key missing program elements are identified. Since government support is needed for new technology prototype integration and qualification testing before manufacturing scale up, the key economic measure is identified in this volume as the electricity cost in cents per kilowatt-hour at the complete installed system level, rather than just the up-front solar cell modules' costs in dollars per watt. 
This Second Edition will benefit technologists in the fields of solar cells and systems; solar cell researchers; power systems designers; academics studying microelectronics, semiconductors, and solar cells; business students and investors with a technical focus; and government and political officials developing public policy.
An authoritative introduction to the roles of switching and transmission in broadband integrated services networks "Principles of Broadband Switching and Networking" explains the design and analysis of switch architectures suitable for broadband integrated services networks, emphasizing packet-switched interconnection networks with distributed routing algorithms. The text examines the mathematical properties of these networks, rather than specific implementation technologies. Although the pedagogical explanations in this book are in the context of switches, many of the fundamental principles are relevant to other communication networks with regular topologies. After explaining the concept of the modern broadband integrated services network and why it is necessary in today's society, the book moves on to basic switch design principles, discussing two types of circuit switch design--space domain and time domain--and packet switch design. Throughput improvements are illustrated by some switch design variations such as Speedup principle, Channel-Grouping principle, Knockout principle, and Dilation principle. Moving seamlessly into advanced switch design principles, the book covers switch scalability, switch design for multicasting, and path switching. Then the focus moves to broadband communications networks that make use of such switches. Readers receive a detailed introduction on how to allocate network resources and control traffic to satisfy the quality of service requirements of network users and to maximize network usage. As an epilogue, the text shows how transmission noise and packet contention have similar characteristics and can be tamed by comparable means to achieve reliable communication. "Principles of Broadband Switching and Networking" is written for senior undergraduate and first-year postgraduate students with a solid background in probability theory.
The only singular, all-encompassing textbook on state-of-the-art technical performance evaluation. Fundamentals of Performance Evaluation of Computer and Telecommunication Systems uniquely presents all techniques of performance evaluation of computer systems, communication networks, and telecommunications in a balanced manner. Written by the renowned Professor Mohammad S. Obaidat and his coauthor Professor Noureddine Boudriga, it is also the only resource to treat computer and telecommunication systems as inseparable issues. The authors explain the basic concepts of performance evaluation, applications, performance evaluation metrics, workload types, benchmarking, and characterization of workload. This is followed by a review of the basics of probability theory, and then the main techniques for performance evaluation--namely measurement, simulation, and analytic modeling--with case studies and examples. The book: contains the practical and applicable knowledge necessary for a successful performance evaluation in a balanced approach; reviews measurement tools, benchmark programs, design of experiments, traffic models, basics of queueing theory, and operational and mean value analysis; covers the techniques for validation and verification of simulation, as well as random number generation, random variate generation, and testing, with examples; and features numerous examples and case studies, as well as exercises and problems for use as homework or programming assignments. Fundamentals of Performance Evaluation of Computer and Telecommunication Systems is an ideal textbook for graduate students in computer science, electrical engineering, computer engineering, and information sciences, technology, and systems. It is also an excellent reference for practicing engineers and scientists.
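The queueing-theory basics mentioned above include standard results such as the M/M/1 formulas; a minimal sketch (our own illustration, not taken from the book):

```python
# Illustrative M/M/1 queue metrics of the kind covered in basic
# queueing theory (our own example): utilization rho, mean number
# in system L, and mean response time W via Little's law.
def mm1_metrics(arrival_rate, service_rate):
    rho = arrival_rate / service_rate          # server utilization
    assert rho < 1, "queue is unstable when rho >= 1"
    L = rho / (1 - rho)                        # mean jobs in the system
    W = L / arrival_rate                       # Little's law: W = L / lambda
    return rho, L, W

rho, L, W = mm1_metrics(arrival_rate=8.0, service_rate=10.0)
print(f"utilization={rho:.2f}, jobs in system={L:.2f}, response time={W:.3f}s")
# → utilization=0.80, jobs in system=4.00, response time=0.500s
```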
A Research-Driven Resource on Building Biochemical Systems to Perform Information Processing Functions "Information Processing by Biochemical Systems" describes fully delineated biochemical systems, organized as neural network-type assemblies. It explains the relationship between these two apparently unrelated fields, revealing how biochemical systems have the advantage of using the "language" of the physiological processes and, therefore, can be organized into the neural network-type assemblies, much in the way that natural biosystems are. A wealth of information is included concerning both the experimental aspects (such as materials and equipment used) and the computational procedures involved. This authoritative reference: Addresses network-type connectivity, considered to be a key feature underlying the information processing ability of the brain Describes novel scientific achievements, and serves as an aid for those interested in further developing biochemical systems that will perform information-processing functions Provides a viable approach for furthering progress in the area of molecular electronics and biocomputing Includes results obtained in experimental studies involving a variety of real enzyme systems "Information Processing by Biochemical Systems" is intended for graduate students and professionals, as well as biotechnologists.
Demystifies FACTS controllers, offering solutions to power control and power flow problems. Flexible alternating current transmission systems (FACTS) controllers represent one of the most important technological advances in recent years, both enhancing controllability and increasing power transfer capacity of electric power transmission networks. This timely publication serves as an applications manual, offering readers clear instructions on how to model, design, build, evaluate, and install FACTS controllers. Authors Kalyan Sen and Mey Ling Sen share their two decades of experience in FACTS controller research and implementation, including their own pioneering FACTS design breakthroughs. Readers gain a solid foundation in all aspects of FACTS controllers, including: basic underlying theories; the step-by-step evolution of FACTS controller development; guidelines for selecting the right FACTS controller; sample computer simulations in the EMTP programming language; key differences in modeling such FACTS controllers as the voltage regulating transformer, phase angle regulator, and unified power flow controller; and modeling techniques and control implementations for the three basic VSC-based FACTS controllers--STATCOM, SSSC, and UPFC. In addition, the book describes a new type of FACTS controller, the Sen Transformer, which is based on technology developed by the authors. An appendix presents all the sample models that are discussed in the book, and the accompanying FTP site offers many more downloadable sample models as well as the full-color photographs that appear throughout the book. This book is essential reading for practitioners and students of power engineering around the world, offering viable solutions to the increasing problems of grid congestion and power flow limitations in electric power transmission systems.
EUVL is an area of intense research and this book provides the foundation required for understanding and applying this technology. It offers contributions from the world's leading EUVL researchers, and provides all the critical information needed by practitioners and those wanting to enter the field.
A clear, step-by-step approach to practical uses of discrete-signal analysis and design, especially for communications and radio engineers. This book provides an introduction to discrete-time and discrete-frequency signal processing, which is rapidly becoming an important, modern way to design and analyze electronics projects of all kinds. It presents discrete-signal processing concepts from the perspective of an experienced electronics or radio engineer, which is especially meaningful for practicing engineers, technicians, and students. The approach is almost entirely mathematical, but at a level that is suitable for undergraduate curriculums and also for independent, at-home study using a personal computer. Coverage includes: first principles, including the Discrete Fourier Transform (DFT); sine, cosine, and theta; spectral leakage and aliasing; smoothing and windowing; multiplication and convolution; probability and correlation; the power spectrum; and the Hilbert transform. The accompanying CD-ROM includes Mathcad(R) v.14 Academic Edition, which is reproduced with permission and has no time limitation for use, providing users with a sophisticated and world-famous tool for a wide range of applied mathematics capabilities. Discrete-Signal Analysis and Design is written in an easy-to-follow, conversational style and supplies readers with a solid foundation for more advanced literature and software. It employs occasional re-examination and reinforcement of particularly important concepts, and each chapter contains self-study examples and full-page Mathcad(R) Worksheets, worked out and fully explained.
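Spectral leakage, one of the topics listed above, is easy to demonstrate: a sinusoid whose frequency falls between DFT bins spreads energy into neighbouring bins. The book's worksheets use Mathcad; the sketch below is an independent Python illustration using a naive DFT:

```python
import math

# Naive O(N^2) DFT, enough to illustrate spectral leakage.
# (Our own illustration; the book's worksheets are in Mathcad.)
def dft_mag(x):
    N = len(x)
    mags = []
    for k in range(N):
        re = sum(x[n] * math.cos(2 * math.pi * k * n / N) for n in range(N))
        im = -sum(x[n] * math.sin(2 * math.pi * k * n / N) for n in range(N))
        mags.append(math.hypot(re, im))
    return mags

N = 32
on_bin = [math.sin(2 * math.pi * 4.0 * n / N) for n in range(N)]   # exactly bin 4
off_bin = [math.sin(2 * math.pi * 4.5 * n / N) for n in range(N)]  # between bins: leaks

m_on, m_off = dft_mag(on_bin), dft_mag(off_bin)
# On-bin: essentially all energy sits in bin 4 and bin 6 is ~0;
# off-bin: neighbouring bins such as bin 6 pick up leaked energy.
print(round(m_on[6], 3), round(m_off[6], 3))
```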
This two-volume handbook offers a comprehensive and coordinated presentation of SQUIDs (Superconducting Quantum Interference Devices), including device fundamentals, design, technology, system construction and multiple applications. It is intended to bridge the gap between fundamentals and applications, and will be a valuable textbook reference for graduate students and for professionals engaged in SQUID research and engineering. It will also be of use to specialists in multiple fields of practical SQUID applications, from human brain research and heart diagnostics to airplane and nuclear plant testing to prospecting for oil, minerals and buried ordnance.
A definitive guide for accurate state-of-the-art modelling of free surface flows. Understanding the dynamics of free surface flows is the starting point of many environmental studies, impact studies, and waterworks design. Typical applications, once the flows are known, are water quality, dam impact and safety, pollutant control, and sediment transport. These studies used to be done in the past with scale models, but these are now being replaced by numerical simulation performed by software suites called "hydro-informatic systems." The Telemac system is the leading software package worldwide, and has been developed by Electricite de France and Jean-Michel Hervouet, who is the head and main developer of the Telemac project. Written by a leading authority on Computational Fluid Dynamics, the book aims to provide environmentalists, hydrologists, and engineers using hydro-informatic systems such as Telemac and the finite element method with a knowledge of the basic principles, capabilities, different hypotheses, and limitations. In particular this book: presents the theory for understanding hydrodynamics through an extensive array of case studies such as tides, tsunamis, storm surges, floods, bores, dam break flood waves, density driven currents, and hydraulic jumps, making this a principal reference on the topic; gives a detailed examination and analysis of the notorious Malpasset dam failure; includes a coherent description of finite elements in shallow water; delivers a significant treatment of the state-of-the-art flow modelling techniques using Telemac, developed by Electricite de France; and provides the fundamental physics and theory of free surface flows to be utilised by courses on environmental flows. Hydrodynamics of Free Surface Flows is essential reading for those involved in computational fluid dynamics and environmental impact assessments, as well as hydrologists, and bridge, coastal and dam engineers.
Guiding readers from fundamental theory to the more advanced topics in the application of the finite element method and the Telemac System, this book is a key reference for a broad audience of students, lecturers, researchers and consultants, right through to the community of users of hydro-informatics systems.
Theoretical and practical tools to master matrix code design strategy and technique. Error correcting and detecting codes are essential to improving system reliability and have been widely applied to computer systems and communication systems. Coding theory has been studied mainly using code generator polynomials; hence, the codes are sometimes called polynomial codes. On the other hand, the codes designed by parity check matrices are referred to in this book as matrix codes. This timely book focuses on the design theory for matrix codes and their practical applications for the improvement of system reliability. As the author effectively demonstrates, matrix codes are far more flexible than polynomial codes, as they are capable of expressing various types of code functions.
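As a concrete illustration of a parity-check-matrix code (our own toy example, not drawn from the book), the Hamming(7,4) parity-check matrix corrects any single-bit error because the syndrome directly encodes the error position:

```python
# Hamming(7,4) parity-check matrix H (our own illustrative example).
# Column i is the binary representation of position i (1-based, LSB first),
# so the syndrome H.r, read as a binary number, names the flipped position.
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(received):
    # H * r over GF(2); each row yields one parity-check bit
    return [sum(h * r for h, r in zip(row, received)) % 2 for row in H]

codeword = [1, 0, 1, 1, 0, 1, 0]   # a valid Hamming(7,4) codeword
assert syndrome(codeword) == [0, 0, 0]

corrupted = codeword[:]
corrupted[4] ^= 1                  # flip the bit at position 5 (1-based)
s = syndrome(corrupted)
position = s[0] + 2 * s[1] + 4 * s[2]
print(position)                    # → 5, locating the error
```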
JPL spacecraft antennas, from the first Explorer satellite in 1958 to current R&D.
The definitive text on microwave ring circuits, now better than ever.
This book is the first pedagogical synthesis of the field of topological insulators and superconductors, one of the most exciting areas of research in condensed matter physics. Presenting the latest developments, while providing all the calculations necessary for a self-contained and complete description of the discipline, it is ideal for researchers and graduate students preparing to work in this area, and it will be an essential reference both within and outside the classroom. The book begins with a fundamental description of the topological phases of matter, such as one-, two- and three-dimensional topological insulators, and covers methods and tools for topological materials investigations, topological insulators for advanced optoelectronic devices, topological superconductors, saturable absorbers, and plasmonic devices. Advanced Topological Insulators provides researchers and graduate students with the physical understanding and mathematical tools needed to embark on research in this rapidly evolving field.
Your guide to advanced thermoelectric materials. Written by a distinguished group of contributors, this book provides comprehensive coverage of the most up-to-date information on all aspects of advanced thermoelectric materials.
Light Emitting Diodes (LEDs) are no longer confined to use in commercial signage and have now moved firmly, and with unquestioned advantages, into the field of commercial and domestic lighting. This development was prompted in the late 1980s by the invention of the blue LED, a wavelength that had previously been missing from the available LED spectrum and which opened the way to providing white light. Since that point, LED performance (including energy efficiency) has improved dramatically, and now compares with the performance of fluorescent lights, with further performance improvements yet to be delivered. The book begins with the principles of LED lighting, then focuses on issues and challenges. Chapters are devoted to key steps in LED manufacturing: substrate, epitaxy, process and packaging. Photoelectric characterization of LEDs, lighting with LEDs, and the requirement for a certain level of color quality are the subjects of later chapters, and finally there is a detailed discussion of the emergence of OLEDs, or organic LEDs, which have specific capabilities of immediate interest and importance in this field.
David L. Morton examines the process of invention, innovation, and diffusion of communications technology, using the history of sound recording as the focus. Off the Record demonstrates how the history of both the hardware and the ways people used it is essential for understanding why any particular technology became a fixture in everyday life or faded into obscurity. Morton's approach to the topic differs from most previous works, which have examined the technology's social impact but not the reasons for its existence. Recording culture in America emerged, Morton writes, not through the dictates of the technology itself but in complex ways that were contingent upon the actions of users. Each of the case studies in the book emphasizes one of five aspects of the culture of recording and its relationship to new technology, at the same time telling the story of sound recording history. One of the misconceptions that Morton hopes to dispel is that the only important category of sound recording involves music. Unique in his broad-based approach to sound technology, Morton investigates five case studies: the phonograph record; recording in the radio business; the dictation machine; the telephone answering machine; and home taping. Readers will learn, for example, that the equipment to create the telephone answering machine has been around for a century, but that the ownership and use of answering machines was a hotly contested issue in the telephone industry at the turn of the century, stifling its commercial development for decades. Morton also offers fascinating insight into early radio: while The Amos and Andy Show initially was pre-recorded and not broadcast live, the commercial stations saw this easily distributed program as an economic threat, since many non-network stations could buy the disks for easy, relatively inexpensive replaying. As a result, Amos and Andy was sold to Mutual and went live shortly afterward.