This book gives a thorough explanation of standardization: its processes, its life cycle, and the related organizations at the national, regional, and global levels. It provides readers with insight into the interaction cycle between standardization organizations, government, industry, and consumers. Drawing on material from the field of information and communications technologies, readers gain a clear understanding of the standardization and innovation processes, the life cycles of standards and innovations, and the related organizations. The book thereby introduces the reader to the perpetual interplay of standards and innovation that underpins the modern world.
Codes, Curves, and Signals: Common Threads in Communications is a collection of seventeen contributions from leading researchers in communications. The book provides a representative cross-section of cutting-edge contemporary research in the fields of algebraic curves and the associated decoding algorithms, the use of signal processing techniques in coding theory, and the application of information-theoretic methods in communications and signal processing. The book is organized into three parts: Curves and Codes, Codes and Signals, and Signals and Information. Codes, Curves, and Signals: Common Threads in Communications is a tribute to the broad and profound influence of Richard E. Blahut on the fields of algebraic coding, information theory, and digital signal processing. All the contributors have individually and collectively dedicated their work to R. E. Blahut. Codes, Curves, and Signals: Common Threads in Communications is an excellent reference for researchers and professionals.
This book of proceedings includes papers presenting the state of the art in electrical engineering and control theory as well as their applications. The topics cover classical and modern methods for modeling, control, identification, and simulation of complex systems, with applications in science and engineering. The papers were selected from the most active topic areas, such as control and systems engineering, renewable energy, fault diagnosis and fault-tolerant control, large-scale systems, fractional-order systems, unconventional algorithms in control engineering, and signals and communications. The control and design of complex system dynamics, and the analysis and modeling of their behavior and structure, are vitally important in engineering, in economics, and in science generally. Examples of such systems can be seen in the world around us and are part of our everyday life. Applications of modern methods for control, electronics, and signal processing can be found in our mobile phones, car engines, and home devices such as washing machines, as well as in advanced devices such as space probes and the systems for communicating with them. All these technologies form part of the technological backbone of our civilization, making further research and high-tech applications essential. The rich variety of contributions appeals to a wide audience, including researchers, students, and academics.
Over the last few years, Web technology has grown so rapidly that it is hard for interested readers to learn and keep up with the techniques. It would be extremely useful to have a single book that collectively describes not only the underlying areas from which Internet technology derives its solutions, but also the specific solutions to important applications on the World Wide Web. Foundations of Web Technology covers the basics of Web technology while being specialized enough to add value to experienced professionals working in this field. Most books on the Web focus on programmatic aspects of languages such as Java and JavaScript, or on descriptions of standards such as the Hypertext Markup Language (HTML) or the Wireless Markup Language (WML). A book that covers the concepts behind the infrastructure of the Web would be indispensable to a wide range of audiences interested in learning how the Web works, how techniques in Web technology can be applied to their own problems, and what the emergent technological trends in these areas are.
Broadband Fixed Wireless Access provides a systematic overview of the emerging WiMAX technology, and much of the material is based on the authors' practical experience in building new systems. This material will be of significant interest to network architects and developers of broadband fixed wireless access products. With the advent of the IEEE 802.16 standard and next-generation equipment, interest in this technology has been growing. The authors discuss applications at microwave frequencies between 2 and 11 GHz that could be attractive options for operators without an existing access infrastructure for reaching end users. This introductory volume demystifies the technology and provides technical exposure to the various system trade-offs. Additionally, the book features the following highlights: detailed modeling of the broadband fixed wireless access propagation channel, including new measurements of its time variation; an extensive overview of the IEEE 802.16 standard; a discussion of the suitability of various multi-antenna techniques; and elaboration of techniques such as auto-directing antennas, bridging with WLAN, and multi-hop networking that can be used to reduce the cost of ownership of a WiMAX network for an operator. The authors cover a wide range of topics, from network deployment to the implementation of terminals. Wireless professionals will gain a head start from the information on WiMAX technology. This is a must-read book when starting with broadband fixed wireless access.
The objective of the 2nd International Conference on Green Communications and Networks 2012 (GCN 2012) is to facilitate an exchange of information on best practices for the latest research advances in the area of communications, networks and intelligence applications. These mainly involve computer science and engineering, informatics, communications and control, electrical engineering, information computing, and business intelligence and management. "Proceedings of the 2nd International Conference on Green Communications and Networks 2012 (GCN 2012)" will focus on green information technology and applications, which will provide in-depth insights for engineers and scientists in academia, industry, and government. The book addresses the most innovative research developments including technical challenges, social and economic issues, and presents and discusses the authors' ideas, experiences, findings, and current projects on all aspects of advanced green information technology and applications. Yuhang Yang is a professor at the Department of Electronic Engineering, Shanghai Jiao Tong University. Maode Ma is an associate professor at the School of Electrical & Electronic Engineering, Nanyang Technological University.
One aspect of the new economy is a transition to a networked society, and the emergence of a highly interconnected, interdependent and complex system of networks to move people, goods and information. An example of this is the increasing reliance of networked systems (e.g., air transportation networks, the electric power grid, maritime transport, etc.) on telecommunications and information infrastructure. Many of the networks that evolved today have an added complexity in that they have both a spatial structure, i.e., they are located in physical space, but also an aspatial dimension brought on largely by their dependence on information technology. They are also often just one component of a larger system of geographically integrated and overlapping networks operating at different spatial levels. An understanding of these complexities is imperative for the design of plans and policies that can be used to optimize the efficiency, performance and safety of transportation, telecommunications and other networked systems. In one sense, technological advances along with economic forces that encourage the clustering of activities in space to reduce transaction costs have led to more efficient network structures. At the same time, the very properties that make these networks more efficient have also put them at a greater risk of becoming disconnected or significantly disrupted when super-connected nodes are removed, either intentionally or through a targeted attack.
The broad scope of Cloud Computing is creating a technology, business, sociological, and economic renaissance. It delivers the promise of making services available quickly with rather little effort. Cloud Computing allows almost anyone, anywhere, at any time to interact with these service offerings. Cloud Computing creates a unique opportunity for its users that allows anyone with an idea to have a chance to deliver it to a mass market base. As Cloud Computing continues to evolve and penetrate different industries, it is inevitable that the scope and definition of Cloud Computing becomes very subjective, based on providers' and customers' perspective of applications. For instance, Information Technology (IT) professionals perceive a Cloud as an unlimited, on-demand, flexible computing fabric that is always available to support their needs. Cloud users experience Cloud services as virtual, off-premise applications provided by Cloud service providers. To an end user, a provider offering a set of services or applications in the Cloud can manage these offerings remotely. Despite these discrepancies, there is a general consensus that Cloud Computing includes technology that uses the Internet and collaborated servers to integrate data, applications, and computing resources. With proper Cloud access, such technology allows consumers and businesses to access their personal files on any computer without having to install special tools. Cloud Computing facilitates efficient operations and management of computing technologies by federating storage, memory, processing, and bandwidth.
Intended for undergraduate students of electrical engineering, this introduction to electromagnetic fields emphasizes the computation of fields as well as the development of theoretical relations. The first part thus presents the electromagnetic field and Maxwell's equations with a view toward connecting the disparate applications to the underlying relations, while the second part presents computational methods of solving the equations, which for most practical cases cannot be solved analytically.
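For orientation, these are the standard macroscopic Maxwell's equations in differential form (SI units), around which such a treatment is built; the notation (E, B, H, D, rho, J) is the conventional textbook one and is not necessarily the notation used in this particular book:

```latex
\begin{aligned}
\nabla \cdot \mathbf{D} &= \rho, &
\nabla \cdot \mathbf{B} &= 0, \\
\nabla \times \mathbf{E} &= -\frac{\partial \mathbf{B}}{\partial t}, &
\nabla \times \mathbf{H} &= \mathbf{J} + \frac{\partial \mathbf{D}}{\partial t}.
\end{aligned}
```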
The 3rd International Conference on Foundations and Frontiers in Computer, Communication and Electrical Engineering is a notable event which brings together academia, researchers, engineers and students in the fields of Electronics and Communication, Computer and Electrical Engineering, making the conference a perfect platform to share experience, foster collaborations across industry and academia, and evaluate emerging technologies across the globe. The conference is technically co-sponsored by the IEEE Kolkata Section along with several IEEE chapters of the Kolkata Section, such as the Electron Devices Society, Power and Energy Society, Dielectrics and Electrical Insulation Society, and Computer Society, and is held in association with CSIR-CEERI, Pilani, Rajasthan. The scope of the conference covers broad areas of interest such as (but not limited to) Satellite and Mobile Communication Systems, Radar, Antennas, High Power Microwave Systems (HPMS), Electronic Warfare, Information Warfare, UWB Systems, Microwave and Optical Communications, Microwave and Millimetre-Wave Tubes, Photonics, Plasma Devices, Missile Tracking and Guided Systems, High Voltage Engineering, Electrical Machines, Power Systems, Control Systems, Non-Conventional Energy, Power Electronics and Drives, Machine Learning and Artificial Intelligence, Networking, Image Processing, Soft Computing, Cloud Computing, Data Mining and Data Warehousing, etc.
The process control industry has seen generations of technology advancement, from pneumatic communication to electrical communication to electronic communication, and from centralized control to distributed control. At the center of today's distributed control systems are operator workstations. These operator workstations provide the connection between those overseeing and running plant operations and the process itself. With each new generation of products, the operator workstation has become increasingly more intelligent. Newer applications provide advanced alarming, control, and diagnostics. Behind all of these applications are smarter devices. These smart devices provide greater process insight, reduce engineering costs, and contribute to improving the overall operational performance of the plant. Smart devices include advanced diagnostics that can report the health of the device and, in many cases, the health of the process that the device is connected to. It is not uncommon for smart devices to include diagnostics that can detect plugged lines, burner flame instability, agitator loss, wet gas, orifice wear, leaks, and cavitation. These devices tell the user how well they are operating and when they need maintenance. Improvements in sensor technology and diagnostics have led to a large variety of smart devices. So how do users connect the capabilities of these smart devices to their existing control system infrastructures? The answer is wireless. Wireless technology has matured to the point that it can now be safely applied in industrial control, monitoring, and asset management applications.
This book provides an overview of positioning technologies, applications and services in a format accessible to a wide variety of readers. Readers who have always wanted to understand how satellite-based positioning, wireless network positioning, inertial navigation, and their combinations work will find great value in this book. Readers will also learn about the advantages and disadvantages of different positioning methods, as well as their limitations and challenges. Cognitive positioning, which adds the intelligence to decide at device runtime which technologies to use, is introduced as well. Coverage also includes the use of position information for Location Based Services (LBS), as well as context-aware positioning services designed for a better user experience.
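As a hedged illustration of the satellite-based case mentioned above (standard GNSS textbook notation, not drawn from this book): the receiver position (x, y, z) and clock bias delta-t are conventionally recovered from pseudorange measurements to at least four satellites of the form

```latex
\rho_i \;=\; \sqrt{(x - x_i)^2 + (y - y_i)^2 + (z - z_i)^2} \;+\; c\,\delta t \;+\; \varepsilon_i,
\qquad i = 1, \dots, N,\ N \ge 4,
```

where (x_i, y_i, z_i) is the known position of satellite i, c the speed of light, and epsilon_i the measurement error; the nonlinear system is then typically solved by iterative least squares.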
This book presents a number of research efforts in combining AI methods or techniques to solve complex problems in various areas. The combination of different intelligent methods is an active research area in artificial intelligence (AI), since it is believed that complex problems can be more easily solved with integrated or hybrid methods, such as combinations of different soft computing methods (fuzzy logic, neural networks, and evolutionary algorithms) among themselves or with hard AI technologies like logic and rules; machine learning with soft computing and classical AI methods; and agent-based approaches with logic and non-symbolic approaches. Some of the combinations are already extensively used, including neuro-symbolic methods, neuro-fuzzy methods, and methods combining rule-based and case-based reasoning. However, other combinations are still being investigated, such as those related to the semantic web, deep learning and swarm intelligence algorithms. Most are connected with specific applications, while the rest are based on principles.
Fundamentals of Codes, Graphs, and Iterative Decoding is an explanation of how to introduce local connectivity, and how to exploit simple structural descriptions. Chapter 1 provides an overview of Shannon theory and the basic tools of complexity theory, communication theory, and bounds on code construction. Chapters 2-4 provide an overview of "classical" error control coding, with an introduction to abstract algebra, and block and convolutional codes. Chapters 5-9 then proceed to systematically develop the key research results of the 1990s and early 2000s, with an introduction to graph theory, followed by chapters on algorithms on graphs, turbo error control, low-density parity-check codes, and low-density generator codes.
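To make the link between the "classical" block codes and the graph-based view concrete, here is a minimal sketch (not taken from the book) using the well-known (7,4) Hamming code: the rows of the parity-check matrix H act as check nodes, the columns as bit nodes, and a received word is consistent exactly when its syndrome is zero.

```python
import numpy as np

# Parity-check matrix H of the classic (7,4) Hamming code.
# Rows = parity constraints (check nodes); columns = code bits (variable nodes).
H = np.array([
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
], dtype=int)

def syndrome(word: np.ndarray) -> np.ndarray:
    """Return s = H . word (mod 2); an all-zero syndrome means a valid codeword."""
    return H.dot(word) % 2

# A valid codeword: message bits 1 0 1 1 followed by their three parity bits.
codeword = np.array([1, 0, 1, 1, 0, 1, 0])
print(syndrome(codeword))     # -> [0 0 0]

# The same word with one bit flipped: the nonzero syndrome shows which
# parity checks are violated (here it equals column 2 of H).
received = codeword.copy()
received[2] ^= 1
print(syndrome(received))     # -> [0 1 1]
```

Iterative decoders for the low-density codes covered in the later chapters work on exactly this bipartite check/bit structure, just with much larger and sparser H.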
An all-encompassing guide to the business, engineering, and regulatory factors shaping the growth of the distance learning industry. This book examines potential providers, users, applications, and problem solutions, and includes actual case studies. An outstanding reference for educators, network service providers, public policy makers, and graduate level engineering students specializing in telecommunications.
With the rapid growth of bandwidth demand from network users and the advances in optical technologies, optical networks with multi-terabit-per-second capacity have received significant interest from both researchers and practitioners. The deployment of optical networks raises a number of challenging problems that require innovative solutions, including network architectures, scalable and fast network management, resource-efficient routing and wavelength assignment algorithms, QoS support and scheduling algorithms, and switch and router architectures. In this book, we put together some important developments in this exciting area during the last several years. Some of the articles are research papers and some are surveys. All articles were reviewed by two reviewers. The paper "On Dynamic Wavelength Assignment in WDM Optical Networks," by Alanyali, gives an overview of some issues in the analysis and synthesis of dynamic wavelength assignment policies for optical WDM networks and illustrates a new method of analysis. The paper by Ellinas and Bala, "Wavelength Assignment Algorithms for WDM Ring Architectures," presents two optimal wavelength assignment algorithms that assign the minimum number of wavelengths between nodes on WDM rings to achieve full mesh connectivity. In the paper "Optimal Placement of Wavelength Converters in WDM Networks for Parallel and Distributed Computing Systems," Jia et al. address the optimal placement of wavelength converters.
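To give a concrete flavor of the wavelength assignment problem these papers study, the sketch below shows the simple first-fit heuristic under the usual wavelength-continuity constraint: each lightpath (a route given as a list of links) gets the lowest-indexed wavelength that is still free on all of its links. This is a generic illustration, not one of the algorithms presented in the book.

```python
def first_fit_assign(lightpaths, num_wavelengths):
    """Greedy first-fit wavelength assignment with wavelength continuity:
    a lightpath uses the same wavelength on every link it traverses, and
    two lightpaths sharing a link must use different wavelengths.

    lightpaths: list of routes, each a list of hashable link identifiers.
    Returns the chosen wavelength index per lightpath, or None if the
    request would be blocked (no wavelength free on the whole route).
    """
    used = {}          # link -> set of wavelengths already in use on that link
    assignment = []
    for route in lightpaths:
        chosen = None
        for w in range(num_wavelengths):
            if all(w not in used.get(link, set()) for link in route):
                chosen = w
                break
        if chosen is not None:
            for link in route:
                used.setdefault(link, set()).add(chosen)
        assignment.append(chosen)
    return assignment

# Example on a 4-node ring with links A-B, B-C, C-D, D-A:
routes = [
    [("A", "B"), ("B", "C")],   # lightpath A -> C via B
    [("B", "C"), ("C", "D")],   # lightpath B -> D via C
    [("A", "B")],               # lightpath A -> B
]
print(first_fit_assign(routes, num_wavelengths=2))   # -> [0, 1, 1]
```

Optimal algorithms such as those for WDM rings trade this greedy simplicity for guarantees on the total number of wavelengths used.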
Essential background reading for engineers and scientists working in such fields as communications, control, signal, and image processing, radar and sonar, radio astronomy, seismology, remote sensing, and instrumentation. The book can be used as a textbook for a single course, as well as a combination of an introductory and an advanced course, or even for two separate courses, one in signal detection, the other in estimation.
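For readers new to the detection side of the subject, the central object is the likelihood-ratio test; the notation below is the generic textbook form, not necessarily this book's:

```latex
\Lambda(\mathbf{y}) \;=\; \frac{p(\mathbf{y}\mid H_1)}{p(\mathbf{y}\mid H_0)}
\;\underset{H_0}{\overset{H_1}{\gtrless}}\; \eta,
```

where y is the observed data, H1 and H0 are the signal-present and signal-absent hypotheses, and the threshold eta is set by the chosen design criterion (for example Bayes risk or a Neyman-Pearson false-alarm constraint).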
The need for automatic speech recognition systems to be robust with respect to changes in their acoustical environment has become more widely appreciated in recent years, as more systems are finding their way into practical applications. Although the issue of environmental robustness has received only a small fraction of the attention devoted to speaker independence, even speech recognition systems that are designed to be speaker independent frequently perform very poorly when they are tested using a different type of microphone or acoustical environment from the one with which they were trained. The use of microphones other than a "close-talking" headset also tends to severely degrade speech recognition performance. Even in relatively quiet office environments, speech is degraded by additive noise from fans, slamming doors, and other conversations, as well as by the effects of unknown linear filtering arising from reverberation caused by surface reflections in a room, or from spectral shaping by microphones or the vocal tracts of individual speakers. Speech recognition systems designed for long-distance telephone lines, or for applications deployed in more adverse acoustical environments such as motor vehicles, factory floors, or outdoors, demand far greater degrees of environmental robustness. There are several different ways of building acoustical robustness into speech recognition systems. Arrays of microphones can be used to develop a directionally sensitive system that resists interference from competing talkers and other noise sources that are spatially separated from the source of the desired speech signal.
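As a hedged illustration of the microphone-array idea mentioned above (not code from the book): a delay-and-sum beamformer time-aligns the per-microphone signals according to the expected arrival delays from the desired direction and averages them, reinforcing the target source while attenuating noise arriving from other directions.

```python
import numpy as np

def delay_and_sum(signals, delays_samples):
    """Simple delay-and-sum beamformer (illustrative sketch).

    signals: array of shape (num_mics, num_samples), one row per microphone.
    delays_samples: integer steering delays (in samples) that time-align the
        desired source across microphones.
    Returns the beamformed single-channel signal.
    """
    num_mics, num_samples = signals.shape
    out = np.zeros(num_samples)
    for m in range(num_mics):
        # undo each channel's steering delay (circular shift is fine for a toy
        # example), then average across microphones
        out += np.roll(signals[m], -delays_samples[m])
    return out / num_mics

# Toy usage: two microphones; the desired tone reaches mic 1 three samples late.
fs = 16000
t = np.arange(1024) / fs
target = np.sin(2 * np.pi * 440 * t)
mics = np.vstack([target, np.roll(target, 3)]) + 0.1 * np.random.randn(2, t.size)
enhanced = delay_and_sum(mics, delays_samples=[0, 3])
```

Averaging N aligned channels reduces uncorrelated noise power by roughly a factor of N, which is the basic benefit the blurb alludes to.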
It gives me immense pleasure to introduce this timely handbook to the research and development communities in the field of signal processing systems (SPS). This is the first of its kind and represents state-of-the-art coverage of research in this field. The driving force behind information technologies (IT) hinges critically upon the major advances in both component integration and system integration. The major breakthrough for the former is undoubtedly the invention of the IC in the 1950s by Jack S. Kilby, the Nobel Prize Laureate in Physics 2000. In an integrated circuit, all components were made of the same semiconductor material. Beginning with the pocket calculator in 1964, many increasingly complex applications have followed. In fact, processing gates and memory storage on a chip have since grown at an exponential rate, following Moore's Law. (Moore himself admitted that Moore's Law had turned out to be more accurate, longer lasting and deeper in impact than he ever imagined.) With greater device integration, various signal processing systems have been realized for many killer IT applications. Further breakthroughs in computer science and Internet technologies have also catalyzed large-scale system integration. All these have led to today's IT revolution, which has profound impacts on our lifestyle and the overall prospects of humanity. (It is hard to imagine life today without mobiles or the Internet.) The success of SPS requires a well-concerted, integrated approach from multiple disciplines, such as device, design, and application.
The last ten years have seen a great flowering of the theory of digital data modulation. This book is a treatise on digital modulation theory, with an emphasis on these more recent innovations. It has its origins in a collaboration among the authors that began in 1977. At that time it seemed odd to us that the subjects of error-correcting codes and data modulation were so separated; it seemed also that not enough understanding underlay the mostly ad hoc approaches to data transmission. A great many others were intrigued, too, and the result was a large body of new work that makes up most of this book. Now the older disciplines of detection theory and coding theory have been generalized and applied to the point where it is hard to tell where these end and the theories of signal design and modulation begin. Despite our emphasis on the events of the last ten years, we have included all the traditional topics of digital phase modulation. Signal space concepts are developed, as are simple phase-shift-keyed and pulse-shaped modulations; receiver structures are discussed, from the simple linear receiver to the Viterbi algorithm; the effects of channel filtering and of hard limiting are described. The volume thus serves well as a pedagogical book for research engineers in industry and second-year graduate students in communications engineering. The production of a manageable book required that many topics be left out.
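For context, a phase-shift-keyed signal of the kind discussed here is conventionally written as (generic textbook notation, not necessarily the book's):

```latex
s_m(t) \;=\; A \cos\!\left(2\pi f_c t + \frac{2\pi m}{M}\right),
\qquad m = 0, 1, \dots, M-1,\quad 0 \le t < T,
```

where f_c is the carrier frequency, T the symbol duration, and M the number of phases (M = 2 gives BPSK, M = 4 gives QPSK).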
Faithful communication is a necessary precondition for large-scale quantum information processing and networking, irrespective of the physical platform. Thus, the problems of quantum-state transfer and quantum-network engineering have attracted enormous interest in recent years, and constitute one of the most active areas of research in quantum information processing. The present volume introduces the reader to fundamental concepts and various aspects of this exciting research area, including links to other related areas and problems. The implementation of state-transfer schemes and the engineering of quantum networks are discussed in the framework of various quantum-optical and condensed-matter systems, emphasizing the interdisciplinary character of the research area. Each chapter is a review of theoretical or experimental achievements on a particular topic, written by leading scientists in the field. The volume is aimed at newcomers as well as experienced researchers.
Since the first edition of this book was published seven years ago, the field of modeling and simulation of communication systems has grown and matured in many ways, and the use of simulation as a day-to-day tool is now even more common practice. With the current interest in digital mobile communications, a primary area of application of modeling and simulation is now in wireless systems of a different flavor from the 'traditional' ones. This second edition represents a substantial revision of the first, partly to accommodate the new applications that have arisen. New chapters include material on modeling and simulation of nonlinear systems, with a complementary section on related measurement techniques, channel modeling, and three new case studies; a consolidated set of problems is provided at the end of the book.
Digital Baseband Transmission and Recording provides an integral, in-depth and up-to-date overview of the signal processing techniques that are at the heart of digital baseband transmission and recording systems. The coverage ranges from fundamentals to applications in such areas as digital subscriber loops and magnetic and optical storage. Much of the material presented here has never before appeared in book form. The main features of Digital Baseband Transmission and Recording include: a survey of digital subscriber lines and digital magnetic and optical storage; a review of fundamental transmission and reception limits; an encyclopedic introduction to baseband modulation codes; development of a rich palette of equalization techniques; a coherent treatment of Viterbi detection and many near-optimum detection schemes; an overview of adaptive reception techniques that encompasses adaptive gain and slope control, adaptive detection, and novel forms of zero-forcing adaptation; and an in-depth review of timing recovery and PLLs, with an extensive catalog of timing-recovery schemes. Featuring around 450 figures, 200 examples, 350 problems and exercises, and 750 references, Digital Baseband Transmission and Recording is an essential reference source for engineers and researchers active in telecommunications and digital recording. It will also be useful for advanced courses in digital communications.
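As a hedged, self-contained illustration of the Viterbi detection listed among the book's features (a generic maximum-likelihood sequence detector for binary symbols over a known short FIR channel, not code from the book):

```python
import numpy as np

def viterbi_detect(received, channel):
    """Maximum-likelihood sequence detection of binary (+/-1) symbols observed
    through a known FIR channel with additive noise, via the Viterbi algorithm.
    Illustrative sketch: equal starting metrics for all states, squared
    Euclidean branch metrics, no traceback-depth truncation."""
    memory = len(channel) - 1          # channel memory L
    n_states = 2 ** memory             # state = the last L transmitted symbols
    symbols = (-1.0, 1.0)

    cost = np.zeros(n_states)          # accumulated path metrics
    paths = [[] for _ in range(n_states)]

    for r in received:
        new_cost = np.full(n_states, np.inf)
        new_paths = [None] * n_states
        for state in range(n_states):
            # bit i of `state` encodes the symbol sent (i + 1) steps ago
            history = [symbols[(state >> i) & 1] for i in range(memory)]
            for bit, a in enumerate(symbols):
                # noiseless channel output for current symbol `a` and this history
                y = channel[0] * a + sum(channel[i + 1] * history[i]
                                         for i in range(memory))
                metric = cost[state] + (r - y) ** 2
                nxt = ((state << 1) | bit) & (n_states - 1)
                if metric < new_cost[nxt]:
                    new_cost[nxt] = metric
                    new_paths[nxt] = paths[state] + [a]
        cost, paths = new_cost, new_paths

    return paths[int(np.argmin(cost))]

# Toy usage: two-tap channel with intersymbol interference and light noise.
rng = np.random.default_rng(0)
tx = rng.choice([-1.0, 1.0], size=20)
h = [1.0, 0.5]
rx = np.convolve(tx, h)[:len(tx)] + 0.05 * rng.normal(size=len(tx))
print(np.array_equal(viterbi_detect(rx, h), tx))   # expected: True at this low noise level
```

Practical detectors add refinements the book covers in depth, such as finite traceback depth and adaptive metrics, but the trellis search above is the core idea.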
The book describes a method for modeling systems architecture, particularly of telecom networks and systems, although a large part can be used in a wider context. The method is called Sysnet Modeling and is based on a new modeling language, AML (Abstract systems Modeling Language), which is also described in the book. By applying Sysnet Modeling and AML, a formal model of the system is created. That model can be used for systems analysis as well as for communicating system knowledge to a broader audience of engineers in development projects. Inherent in Sysnet Modeling is the potential for a considerable reduction in the time spent on system implementation, through the possibilities for code and test-case generation.