This book presents a comprehensive overview of wireless sensor networks (WSNs) with an emphasis on security, coverage, and localization. It offers a structural treatment of WSN building blocks including hardware and protocol architectures and also provides a systems-level view of how WSNs operate. These building blocks will allow readers to program specialized applications and conduct research in advanced topics. A brief introductory chapter covers common applications and communication protocols for WSNs. Next, the authors review basic mathematical models such as Voronoi diagrams and Delaunay triangulations. Sensor principles, hardware structure, and medium access protocols are examined. Security challenges ranging from defense strategies to network robustness are explored, along with quality of service measures. Finally, this book discusses recent developments and future directions in WSN platforms. Each chapter concludes with classroom-tested exercises that reinforce key concepts. This book is suitable for researchers and for practitioners in industry. Advanced-level students in electrical engineering and computer science will also find the content helpful as a textbook or reference.
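To see why Voronoi diagrams arise in coverage analysis, consider that the Voronoi cell of a sensor is the set of points closer to it than to any other sensor, so assigning a point to its nearest sensor is a membership test for that cell. The following is a minimal sketch of that idea (our own illustration, not code from the book; the function name and coordinates are made up):

```python
import math

def nearest_sensor(point, sensors):
    """Return the index of the sensor whose Voronoi cell contains `point`.

    The point belongs to the cell of the sensor it is closest to, so a
    nearest-neighbor query implicitly evaluates the Voronoi partition.
    """
    return min(range(len(sensors)),
               key=lambda i: math.dist(point, sensors[i]))

# Three hypothetical sensor positions in the plane.
sensors = [(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)]

print(nearest_sensor((0.5, 0.2), sensors))  # lies in sensor 0's cell
print(nearest_sensor((3.9, 0.1), sensors))  # lies in sensor 1's cell
```

Computing the full cell boundaries requires a proper Voronoi construction, but this nearest-sensor test is the operational core of coverage questions such as "which sensor serves this location?".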
The book presents a coherent description of distributed manufacturing, providing a solid base for further research on the subject as well as smart implementations in companies. It provides a guide for those researching and working in a range of fields, such as smart manufacturing, cloud computing, RFID tracking, distributed automation, cyber-physical production and global "design anywhere, manufacture anywhere" solutions. Foundations & Principles of Distributed Manufacturing anticipates future advances in the fields of embedded systems, the Internet of Things and cyber-physical systems, outlining how adopting these innovations could rapidly bring about improvements in key performance indicators, which could in turn generate competitive pressure by rendering successful business models obsolete. In laying the groundwork for powerful theoretical models, high standards for the homogeneity and soundness of the suggested setups are applied. The book especially elaborates on the upcoming competition in online manufacturing operations and respective control procedures. By outlining encapsulation and evolving decision-making principles, Foundations & Principles of Distributed Manufacturing fully conceptualizes the view of manufacturing networks as sets of loosely coupled interacting smart factory objects. Moreover, the book provides concrete approaches to a number of future fields, where distributed manufacturing might be applied. Both researchers and professionals will profit from the authors' broad experience in Distributed Manufacturing and Fractal Enterprise implementations, where they initiated and completed a number of successful research projects: within the global Intelligent Manufacturing Systems (IMS) scheme, within the European Research Area frameworks as well as national contexts, and both in industry and at leading research institutions.
This background ensures well-founded theory on one hand and valuable practical results on the other in a fascinating area that is still under intensive research. Readers will acquire essential insights as well as useful guidance for categorizing and specifying extended distributed manufacturing solutions and their professional implementations.
This book discusses a variety of methods for outlier ensembles and organizes them by the specific principles with which accuracy improvements are achieved. In addition, it covers the techniques with which such methods can be made more effective. A formal classification of these methods is provided, and the circumstances in which they work well are examined. The authors cover how outlier ensembles relate (both theoretically and practically) to the ensemble techniques used commonly for other data mining problems like classification. The similarities and (subtle) differences in the ensemble techniques for the classification and outlier detection problems are explored. These subtle differences do impact the design of ensemble algorithms for the latter problem. This book can be used for courses in data mining and related curricula. Many illustrative examples and exercises are provided in order to facilitate classroom teaching. Familiarity with the outlier detection problem, and with the generic problem of ensemble analysis in classification, is assumed. This is because many of the ensemble methods discussed in this book are adaptations from their counterparts in the classification domain. Some techniques explained in this book, such as wagging, randomized feature weighting, and geometric subsampling, provide new insights that are not available elsewhere. Also included is an analysis of the performance of various types of base detectors and their relative effectiveness. The book is valuable for researchers and practitioners who wish to leverage ensemble methods in optimal algorithmic design.
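One of the simplest accuracy-improvement principles behind outlier ensembles is variance reduction: fit the same base detector on many random subsamples and average the resulting scores. The sketch below (our own illustration under assumed settings; the function names, the z-score base detector, and the toy data are ours, not the book's) shows subsampling-plus-averaging with a one-dimensional z-score detector:

```python
import random
import statistics

def subsampled_ensemble(data, rounds=25, frac=0.5, seed=0):
    """Average z-score outlier scores, each fitted on a random subsample.

    Each round fits the base detector's parameters (mean, stdev) on a
    random subsample, then scores every point; averaging across rounds
    reduces the variance of the final outlier score.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    totals = [0.0] * len(data)
    k = max(2, int(frac * len(data)))
    for _ in range(rounds):
        sample = rng.sample(data, k)
        mu = statistics.mean(sample)
        sd = statistics.pstdev(sample) or 1.0  # guard against zero spread
        for i, x in enumerate(data):
            totals[i] += abs(x - mu) / sd
    return [t / rounds for t in totals]

data = [1.0, 1.2, 0.9, 1.1, 1.05, 8.0]  # last point is a clear outlier
scores = subsampled_ensemble(data)
print(max(range(len(data)), key=scores.__getitem__))  # index of top outlier: 5
```

The book's techniques (wagging, randomized feature weighting, geometric subsampling) are more refined variations on this fit-on-a-perturbed-view, then-combine pattern.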
This book combines the three dimensions of technology, society and economy to explore the advent of today's cloud ecosystems as successors to older service ecosystems based on networks. Further, it describes the shifting of services to the cloud as a long-term trend that is still progressing rapidly. The book adopts a comprehensive perspective on the key success factors for the technology - compelling business models and ecosystems including private, public and national organizations. The authors explore the evolution of service ecosystems, describe the similarities and differences, and analyze the way they have created and changed industries. Lastly, based on the current status of cloud computing and related technologies like virtualization, the Internet of Things, fog computing, big data and analytics, cognitive computing and blockchain, the authors provide a revealing outlook on the possibilities of future technologies, the future of the internet, and the potential impacts on business and society.
This book, edited by four of the leaders of the National Science Foundation's Global Environment and Network Innovations (GENI) project, gives the reader a tour of the history, architecture, future, and applications of GENI. Built over the past decade by hundreds of leading computer scientists and engineers, GENI is a nationwide network used daily by thousands of computer scientists to explore the next Cloud and Internet and the applications and services they enable, which will transform our communities and our lives. Since by design it runs on existing computing and networking equipment and over the standard commodity Internet, it is poised for explosive growth and transformational impact over the next five years. Over 70 of the builders of GENI have contributed to present its development, architecture, and implementation, both as a standalone US project and as a federated peer with similar projects worldwide, forming the core of a worldwide network. Applications and services enabled by GENI, from smarter cities to intensive collaboration to immersive education, are discussed. The book also explores the concepts and technologies that transform the Internet from a shared transport network to a collection of "slices" -- private, on-the-fly application-specific nationwide networks with guarantees of privacy and responsiveness. The reader will learn the motivation for building GENI and the experience of its precursor infrastructures, the architecture and implementation of the GENI infrastructure, its deployment across the United States and worldwide, the new network applications and services enabled by and running on the GENI infrastructure, and its international collaborations and extensions. This book is useful for academics in the networking and distributed systems areas, Chief Information Officers in the academic, private, and government sectors, and network and information architects.
In recent decades there has been incredible growth in the use of various internet applications by individuals and organizations who store sensitive information online on different servers. This greater reliance of organizations and individuals on internet technologies and applications increases the threat space and poses several challenges for implementing and maintaining cybersecurity practices. Constructing an Ethical Hacking Knowledge Base for Threat Awareness and Prevention provides innovative insights into how an ethical hacking knowledge base can be used for testing and improving the network and system security posture of an organization. It is critical for every individual and institution to learn the hacking tools and techniques that are used by dangerous hackers, in tandem with forming a team of ethical hacking professionals to test their systems effectively. Highlighting topics including cyber operations, server security, and network statistics, this publication is designed for technical experts, students, academicians, government officials, and industry professionals.
In December 1974 the first real-time conversation on the ARPAnet took place between Culler-Harrison Incorporated in Goleta, California, and MIT Lincoln Laboratory in Lexington, Massachusetts. This was the first successful application of real-time digital speech communication over a packet network and an early milestone in the explosion of real-time signal processing of speech, audio, images, and video that we all take for granted today. It could be considered the first voice over Internet Protocol (VoIP), except that the Internet Protocol (IP) had not yet been established. In fact, the interest in real-time signal processing had an indirect, but major, impact on the development of IP. This is the story of the development of linear predictive coded (LPC) speech and how it came to be used in the first successful packet speech experiments. Several related stories are recounted as well. The history is preceded by a tutorial on linear prediction methods which incorporates a variety of views to provide context for the stories. This part is a technical survey of the fundamental ideas of linear prediction that are important for speech processing, but the development departs from traditional treatments and takes advantage of several shortcuts, simplifications, and unifications that come with years of hindsight. In particular, some of the key results are proved using short and simple techniques that are not as well known as they should be, and it also addresses some of the common assumptions made when modeling random signals. Linear Predictive Coding and the Internet Protocol is an insightful and comprehensive review of an underpinning technology of the internet and other packet-switched networks. It will be enjoyed by everyone with an interest in past and present real-time signal processing on the internet.
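The core idea of linear prediction is easy to state: predict the current sample as a weighted sum of past samples, with weights chosen to minimize the squared prediction error. For a first-order predictor x̂[n] = a·x[n-1], setting the derivative of the error to zero gives a = Σ x[n]x[n-1] / Σ x[n-1]². The sketch below (our own illustration, not from the book; the function name and test signal are made up) computes that optimal coefficient:

```python
def lpc1(signal):
    """Optimal first-order linear prediction coefficient.

    Minimizes sum_n (x[n] - a*x[n-1])**2 over a, whose closed-form
    solution is a = sum(x[n]*x[n-1]) / sum(x[n-1]**2).
    """
    num = sum(signal[n] * signal[n - 1] for n in range(1, len(signal)))
    den = sum(x * x for x in signal[:-1])
    return num / den

# A geometrically decaying signal x[n] = 0.9 * x[n-1] is predicted
# exactly by a first-order model, so the recovered coefficient is 0.9.
sig = [1.0, 0.9, 0.81, 0.729, 0.6561]
print(lpc1(sig))
```

Speech coders use higher-order versions of the same least-squares fit (typically solved via the autocorrelation method and Levinson-Durbin recursion) and transmit the coefficients plus a residual instead of the raw waveform.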
This book develops, evaluates and refines a cloud service relationship theory that explains how cloud users' uncertainties arise in these relationships and how they can be mitigated. To that end, the book employs principal-agent theory and the concepts of bounded rationality and social embeddedness. Beyond advancing IS research, the findings presented can greatly benefit governments, IT departments and IT providers, helping them to better understand cloud service relationships and to adjust their cloud service strategies accordingly.
This book presents a mathematical treatment of the radio resource allocation of modern cellular communications systems in contested environments. It focuses on fulfilling the quality-of-service requirements of the applications running on user devices, which leverage the cellular system, with attention to elevating the users' quality of experience. The authors also address spectrum congestion by allowing sharing with the band incumbents while providing quality-of-service-minded resource allocation in the network. The content is of particular interest to scheduling experts in the telecommunications industry, academics working on communications applications, and graduate students whose research deals with resource allocation and quality of service.
Is meaningful communication possible between two intelligent parties who share no common language or background? In this work, a theoretical framework is proposed in which it is possible to address when and to what extent such semantic communication is possible: such problems can be rigorously addressed by explicitly focusing on the goals of the communication. Under this framework, it is possible to show that for many goals, communication without any common language or background is possible using universal protocols. This work should be accessible to anyone with an undergraduate-level knowledge of the theory of computation. The theoretical framework presented here is of interest to anyone wishing to design systems with flexible interfaces, either among computers or between computers and their users.
This monograph describes and implements partially homomorphic encryption functions using a unified notation. After introducing the appropriate mathematical background, the authors offer a systematic examination of the following known algorithms: Rivest-Shamir-Adleman; Goldwasser-Micali; ElGamal; Benaloh; Naccache-Stern; Okamoto-Uchiyama; Paillier; Damgaard-Jurik; Boneh-Goh-Nissim; and Sander-Young-Yung. Over recent years partially and fully homomorphic encryption algorithms have been proposed and researchers have addressed issues related to their formulation, arithmetic, efficiency and security. Formidable efficiency barriers remain, but we now have a variety of algorithms that can be applied to various private computation problems in healthcare, finance and national security, and studying these functions may help us to understand the difficulties ahead. The book is valuable for researchers and graduate students in Computer Science, Engineering, and Mathematics who are engaged with Cryptology.
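The defining property of a partially homomorphic scheme is that one arithmetic operation on ciphertexts maps to an operation on the underlying plaintexts. Textbook RSA, the first algorithm in the monograph's list, is multiplicatively homomorphic: Enc(a)·Enc(b) mod n = Enc(a·b mod n). The sketch below demonstrates this with deliberately tiny, insecure toy parameters (our own illustration of the property, not the book's implementation):

```python
# Toy RSA parameters -- far too small for real use, illustration only.
p, q = 61, 53
n = p * q                           # modulus, 3233
phi = (p - 1) * (q - 1)             # 3120
e = 17                              # public exponent
d = pow(e, -1, phi)                 # private exponent (modular inverse)

def enc(m):
    """Textbook RSA encryption: c = m^e mod n."""
    return pow(m, e, n)

def dec(c):
    """Textbook RSA decryption: m = c^d mod n."""
    return pow(c, d, n)

a, b = 7, 12
# Multiplying ciphertexts multiplies the plaintexts (mod n):
product_cipher = enc(a) * enc(b) % n
print(dec(product_cipher))  # 84 == 7 * 12
```

Schemes such as Paillier are instead additively homomorphic (ciphertext multiplication adds plaintexts), which is why the choice of scheme depends on which operation the private computation needs.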
Synthetic Worlds, Virtual Worlds, and Alternate Realities are all terms used to describe the phenomenon of computer-based, simulated environments in which users inhabit and interact via avatars. The best-known commercial applications are in the form of electronic gaming, and particularly in massively multiplayer online role-playing games like World of Warcraft or Second Life. Less known, but possibly more important, is the rapid adoption of platforms in education and business, where Serious Games are being used for training purposes, and even Second Life is being used in many situations that formerly required travel. The editors of this book capture the state of research in the field, reflecting the rapidly growing yet relatively young market in education and business. The general focus is set on the scientific community but integrates the practical applications for businesses, with papers on information systems, business models, and economics. In six parts, international authors - all experts in their field - discuss the current state-of-the-art of virtual worlds/alternate realities and how the field will develop over the next years. Chapters discuss the influences and impacts in and around virtual worlds. Part four is about education, with a focus on learning environments and experiences, pedagogical models, and the effects on the different roles in the educational sector. The book looks at business models and how companies can participate in virtual worlds while receiving a return on investment, and includes cases and scenarios of integration, from design and implementation to application.
This book discusses the smooth integration of optical and RF networks in 5G and beyond (5G+) heterogeneous networks (HetNets), covering both planning and operational aspects. The integration of high-frequency air interfaces into 5G+ wireless networks can relieve the congested radio frequency (RF) bands. Visible light communication (VLC) is now emerging as a promising candidate for future generations of HetNets. Heterogeneous RF-optical networks combine the high throughput of visible light and the high reliability of RF. However, when implementing these HetNets in mobile scenarios, several challenges arise from both planning and operational perspectives. Since the mmWave, terahertz, and visible light bands share similar wave propagation characteristics, the concepts presented here can be broadly applied in all such bands. To facilitate the planning of RF-optical HetNets, the authors present an algorithm that specifies the joint optimal densities of the base stations by drawing on stochastic geometry in order to satisfy the users' quality-of-service (QoS) demands with minimum network power consumption. From an operational perspective, the book explores vertical handovers and multi-homing using a cooperative framework. For vertical handovers, it employs a data-driven approach based on deep neural networks to predict abrupt optical outages; and, on the basis of this prediction, proposes a reinforcement learning strategy that ensures minimal network latency during handovers. In terms of multi-homing support, the authors examine the aggregation of the resources from both optical and RF networks, adopting a two-timescale multi-agent reinforcement learning strategy for optimal power allocation. Presenting comprehensive planning and operational strategies, the book allows readers to gain an in-depth grasp of how to integrate future coexisting networks at high-frequency bands in a cooperative manner, yielding reliable and high-speed 5G+ HetNets.
This reference book reviews and systematically presents the use of the Internet in administration and politics. A process-oriented layer model defines the options of exchange and participation for all stakeholder groups, covering these topics: eAssistance, eProcurement, eService, eContracting, eSettlement, eCollaboration, eDemocracy, and eCommunity. Case studies show practical applications in industry, administration and research. The book is well suited for students in Business, Economics and Political Science courses, as well as for practitioners interested in the opportunities of digital exchange and participation in the knowledge society.
This book addresses the fundamental theory and key technologies of narrowband and broadband mobile communication systems specifically for railways. It describes novel relaying schemes that meet the different design criteria for railways and discusses the applications of signal classification techniques as well as offline resource scheduling as a way of advancing rail practice. Further, it introduces the novel Long-Term Evolution for Railway (LTE-R) network architecture, the Quality of Service (QoS) requirements of LTE-R and its performance evaluation, and discusses in detail security technologies for rail-dedicated mobile communication systems. The advanced research findings presented in the book are all based on high-speed railway measurement data, which offer insights into the propagation mechanisms and corresponding modeling theory and approaches in unique railway scenarios. It is a valuable resource for researchers, engineers and graduate students in the fields of rail traffic systems, telecommunication and information systems.
These proceedings present the latest information on software reliability, industrial safety, cyber security, physical protection, testing and verification for nuclear power plants. The papers were selected from more than 80 submissions and presented at the First International Symposium on Software Reliability, Industrial Safety, Cyber Security and Physical Protection for Nuclear Power Plants, held in Yinchuan, China on May 30 - June 1, 2016. The primary aim of this symposium was to provide a platform to facilitate the discussion for comprehension, application and management of digital instrumentation, control systems and technologies in nuclear power plants. The book reflects not only the state of the art and latest trends in nuclear instrumentation and control system technologies, but also China's increasing influence in this area. It is a valuable resource for both practitioners and academics working in the field of nuclear instrumentation, control systems and other safety-critical systems, as well as nuclear power plant managers, public officials and regulatory authorities.
This book reveals the historical context and the evolution of the technically complex Allied Signals Intelligence (Sigint) activity against Japan from 1920 to 1945. It traces the all-important genesis and development of the cryptanalytic techniques used to break the main Japanese Navy code (JN-25) and the Japanese Army's Water Transport Code during WWII. This is the first book to describe, explain and analyze the code-breaking techniques developed and used to provide this intelligence, thus closing the sole remaining gap in the published accounts of the Pacific War. The authors also explore the organization of cryptographic teams and issues of security, censorship, and leaks. Correcting gaps in previous research, this book illustrates how Sigint remained crucial to Allied planning throughout the war. It helped direct the advance to the Philippines from New Guinea, the sea battles and the submarine onslaught on merchant shipping. Written by well-known authorities on the history of cryptography and mathematics, Code Breaking in the Pacific is designed for cryptologists, mathematicians and researchers working in communications security. Advanced-level students interested in cryptology, the history of the Pacific War, mathematics or the history of computing will also find this book a valuable resource.
This textbook offers a technical, architectural, and management approach to solving the problems of protecting national infrastructure and includes practical and empirically-based guidance for students wishing to become security engineers, network operators, software designers, technology managers, application developers, Chief Security Officers, and more. This approach includes controversial themes such as the deliberate use of deception to trap intruders. In short, it serves as an attractive framework for a new national strategy for cyber security. Each principle is presented as a separate security strategy, along with pages of compelling examples that demonstrate use of the principle. A specific set of criteria requirements allows students to understand how any organization, such as a government agency, integrates the principles into its local environment. The STUDENT EDITION features several case studies illustrating actual implementation scenarios of the principles and requirements discussed in the text. It also includes helpful pedagogical elements such as chapter outlines, chapter summaries, learning checklists, and a 2-color interior. And it boasts a new and complete instructor ancillary package including a test bank, instructor's manual, PowerPoint slides, case study questions, and more. *Provides case studies focusing on cyber security challenges and solutions to show how theory, research, and methods apply to real-life challenges *Utilizes end-of-chapter case problems that take chapter content and relate it to real security situations and issues *Includes instructor slides for each chapter as well as an instructor's manual with sample syllabi and test bank
Introduction: Background and Status. Design before Evaluation.
Prerequisite Knowledge Areas: Supportive Tools and Techniques. Interface Structures. Basic Measures.
Measurement and Evaluation: Evaluation Terms and Aspects. Tailored Measures of Performance. Evaluation Approaches and Methods.
Special Topics: Stress and User Satisfaction. Visualizable Objects and Spaces. Interaction and Mental Involvement. Structural Specification and Utility.
Index.
This book explores the concept of a map as a fundamental data type. It defines maps at three levels. The first is an abstract level, in which mathematical concepts are leveraged to precisely explain maps and operational semantics. The second is a discrete level, in which graph theory is used to create a data model with the goal of implementation in computer systems. Finally, maps are examined at an implementation level, in which the authors discuss the implementation of a fundamental map data type in database systems. The map data type presented in this book creates new mechanisms for the storage, analysis, and computation of map data objects in any field that represents data in a map form. The authors develop a model that includes a map data type capable of representing thematic and geometric attributes in a single data object. The book provides a complete example of mathematically defining a data type, ensuring closure properties of those operations, and then translating that type into a state that is suited for implementation in a particular context. The book is designed for researchers and professionals working in geography or computer science in a range of fields including navigation, reasoning, robotics, geospatial analysis, data management, and information retrieval.
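The closure property mentioned above means that each operation on the map type yields another value of the same type, so operations can be composed freely. A minimal sketch of that idea, under our own assumptions (the class name, the regions dictionary layout, and the `select` operation are hypothetical illustrations, not the book's definitions):

```python
class Map:
    """A map as a set of named regions, each pairing a geometry
    (here a list of vertex tuples) with a thematic attribute."""

    def __init__(self, regions):
        # regions: {name: (geometry, thematic_value)}
        self.regions = dict(regions)

    def select(self, predicate):
        """Thematic selection: keep regions whose thematic value
        satisfies `predicate`. Closure: the result is again a Map,
        so selections can be chained."""
        return Map({name: (geom, theme)
                    for name, (geom, theme) in self.regions.items()
                    if predicate(theme)})

land_use = Map({
    "parcel_a": ([(0, 0), (1, 0), (1, 1)], "residential"),
    "parcel_b": ([(1, 0), (2, 0), (2, 1)], "industrial"),
})

urban = land_use.select(lambda theme: theme == "residential")
print(sorted(urban.regions))  # ['parcel_a']
```

A full realization of the book's model would also define geometric operations (overlay, intersection) with the same closure guarantee; the point here is only the pattern of an operation that maps the type back into itself.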
The book compiles technologies for enhancing and provisioning security, privacy and trust in cloud systems based on Quality of Service requirements. It is a timely contribution to a field that is gaining considerable research interest and momentum, and it provides comprehensive coverage of technologies related to cloud security, privacy and trust.
The worldwide reach of the Internet allows malicious cyber criminals to coordinate and launch attacks on both cyber and cyber-physical infrastructure from anywhere in the world. The purpose of this handbook is to introduce the theoretical foundations and practical solution techniques for securing critical cyber and physical infrastructures as well as their underlying computing and communication architectures and systems. Examples of such infrastructures include utility networks (e.g., electrical power grids), ground transportation systems (automobiles, roads, bridges and tunnels), airports and air traffic control systems, wired and wireless communication and sensor networks, systems for storing and distributing water and food supplies, medical and healthcare delivery systems, as well as financial, banking and commercial transaction assets. The handbook focuses mostly on the scientific foundations and engineering techniques, while also addressing the proper integration of policies and access control mechanisms, for example, how human-developed policies can be properly enforced by an automated system. *Addresses the technical challenges facing the design of secure infrastructures by providing examples of problems and solutions from a wide variety of internal and external attack scenarios *Includes contributions from leading researchers and practitioners in relevant application areas such as the smart power grid, intelligent transportation systems, the healthcare industry and so on *Loaded with examples of real-world problems and pathways to solutions utilizing specific tools and techniques described in detail throughout
With the popularity of the Wireless Local Area Network (WLAN) standard 802.11 WiFi and the growing interest in the next generation Wireless Metropolitan Area Network (WMAN) standard 802.16 WiMax, the need for effective solutions to the inherent security weaknesses of these networking technologies has become of critical importance. Thoroughly explaining the risks associated with deploying WLAN and WMAN networks, this groundbreaking book offers professionals practical insight into identifying and overcoming these security issues. Including detailed descriptions of possible solutions to a number of specific security problems, the book gives practitioners the hands-on techniques that they need to secure wireless networks in the enterprise and the home.