Welcome to Loot.co.za!
Computer Networks & Communications (NetCom) is the proceedings of the Fourth International Conference on Networks & Communications. This book covers the theory, methodology and applications of computer networks, network protocols and wireless networks, data communication technologies, and network security. The proceedings feature peer-reviewed papers that present research results, projects, surveys and industrial experiences describing significant advances in the diverse areas of computer networks & communications.
This book presents the outcomes of the Third National Conference on Communication, Cloud and Big Data (CCB) held on November 2-3, 2018, at Sikkim Manipal Institute of Technology, Majitar, Sikkim. Featuring a number of papers from the conference, it explores various aspects of communication, computation, cloud, and big data, including routing in cognitive radio wireless sensor networks, big data security issues, routing in ad hoc networks, routing protocols for the Internet of Things (IoT), and algorithms for image quality enhancement.
Since databases are the primary repositories of information for today's organizations and governments, database security has become critically important. Introducing the concept of multilevel security in relational databases, this book provides a comparative study of the various models that support multilevel security policies in the relational database, illustrating the strengths and weaknesses of each model. Multilevel Security for Relational Databases covers multilevel database security concepts along with many other multilevel database security models and techniques. It presents a prototype that readers can implement as a tool for conducting performance evaluations to compare multilevel secure database models. The book supplies a complete view of an encryption-based multilevel security database model that integrates multilevel security for the relational database with a system that encrypts each record with an encryption key according to its security class level. This model will help you utilize an encryption system as a second security layer over the multilevel security layer for the database, reduce the multilevel database size, and improve the response time of data retrieval from the multilevel database. Considering instance-based multilevel database security, the book covers relational database access controls and examines concurrency control in multilevel database security systems. It includes database encryption algorithms, simulation programs, and Visual Studio and Microsoft SQL Server code.
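The core idea the blurb describes - encrypting each record with a key tied to its security class, layered over mandatory access control - can be sketched in a few lines. This is a toy illustration only, not the book's actual model: the class labels, the key derivation, and especially the XOR "cipher" (a dependency-free stand-in for a real authenticated cipher such as AES-GCM) are all assumptions for demonstration.

```python
import hashlib

# Hypothetical security classes ordered by level (illustrative labels only).
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top_secret": 3}

def class_key(level: str, master_secret: bytes) -> bytes:
    # One key per security class, derived from a master secret
    # (a stand-in for a real key-management system).
    return hashlib.sha256(master_secret + level.encode()).digest()

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR "cipher" kept only to make the sketch dependency-free;
    # a real system would use an authenticated cipher such as AES-GCM.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def store(record: str, level: str, master: bytes):
    # Each stored row carries its class label and its class-encrypted body.
    return level, xor_cipher(record.encode(), class_key(level, master))

def retrieve(rows, clearance: str, master: bytes):
    # A user sees only rows at or below their clearance; each row is
    # decrypted with the key of its own class, not the user's.
    out = []
    for level, blob in rows:
        if LEVELS[level] <= LEVELS[clearance]:
            out.append(xor_cipher(blob, class_key(level, master)).decode())
    return out

master = b"example-master-secret"
rows = [store("public notice", "unclassified", master),
        store("troop movements", "secret", master)]
print(retrieve(rows, "confidential", master))  # only the unclassified row
```

The point of the second encryption layer is that even if the access-control check is bypassed, a reader without the class key still cannot recover higher-classified records.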
In this book the author examines 60 GHz and conventional UWB. The book introduces the fundamentals, architectures, and applications of unified ultra wideband devices. The material includes both theory and practice and introduces ultra wideband communication systems and their applications in a systematic manner. The material is written to enable readers to design, analyze, and evaluate UWB communication systems.
Data networking now plays a major role in everyday life, and new applications continue to appear at a blinding pace. Yet we still do not have a sound foundation for designing, evaluating and managing these networks. This book covers topics at the intersection of algorithms and networking. It builds a complete picture of the current state of research on Next Generation Networks and the challenges for the years ahead. Particular focus is given to evolving research initiatives, the architectures they propose, and the implications for networking. Topics: network design and provisioning, hardware issues, layer-3 algorithms and MPLS, BGP and inter-AS routing, packet processing for routing, security and network management, load balancing, oblivious routing and stochastic algorithms, network coding for multicast, overlay routing for P2P networking and content delivery. This timely volume will be of interest to a broad readership, from graduate students to researchers looking to survey recent research and its open questions.
The Palm theory and the Loynes theory of stationary systems are the two pillars of the modern approach to queuing. This book, presenting the mathematical foundations of the theory of stationary queuing systems, contains a thorough treatment of both. This approach helps to clarify the picture, in that it separates the task of obtaining the key system formulas from that of proving convergence to a stationary state and computing its law. The theory is constantly illustrated by classical results and models: the Pollaczek-Khinchine and Takacs formulas, Jackson and Gordon-Newell networks, multiserver queues, blocking queues, loss systems, etc., but it also contains recent and significant examples where the tools developed turn out to be indispensable. Several other mathematical tools which are useful within this approach are also presented, such as the martingale calculus for point processes and stochastic ordering for stationary recurrences. This thoroughly revised second edition contains substantial additions - in particular, exercises and their solutions - rendering this now classic reference suitable for use as a textbook.
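For readers unfamiliar with the classical results named above, the Pollaczek-Khinchine formula for the M/G/1 queue can be stated compactly (standard notation assumed here, not necessarily the book's):

```latex
% M/G/1 queue: Poisson arrivals at rate \lambda, i.i.d. service times S,
% load \rho = \lambda \, \mathbb{E}[S] < 1.
% Pollaczek-Khinchine formula for the mean waiting time in queue:
W = \frac{\lambda \, \mathbb{E}[S^2]}{2\,(1 - \rho)}
```

Note that $W$ depends on the second moment of the service time, not just its mean, so more variable service makes customers wait longer even at the same load.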
"More often than not, it is becoming increasingly evident that the weakest links in the information-security chain are the people. Due an increase in information security threats, it is imperative for organizations and professionals to learn more on the human nature and social interactions behind those creating the problem. Social and Human Elements of Information Security: Emerging Trends and Countermeasures provides insightful, high-quality research into the social and human aspects of information security. A comprehensive source of the latest trends, issues, and findings in the field, this book fills the missing gap in existing literature by bringing together the most recent work from researchers in the fast and evolving field of information security."
This book tells the story of government-sponsored wiretapping in Britain and the United States from the rise of telephony in the 1870s until the terrorist attacks of 9/11. It pays particular attention to the 1990s, which marked one of the most dramatic turns in the history of telecommunications interception. During that time, fiber optic and satellite networks rapidly replaced the copper-based analogue telephone system that had remained virtually unchanged since the 1870s. That remarkable technological advance facilitated the rise of the networked home computer, cellular telephony, and the Internet, and users hailed the dawn of the digital information age. However, security agencies such as the FBI and MI5 were concerned. Since the emergence of telegraphy in the 1830s, security services had been able to intercept private messages using wiretaps, facilitated by some of the world's largest telecommunications monopolies, such as AT&T in the US and British Telecom in the UK. The new, digital networks were incompatible with traditional wiretap technology. To make things more complicated for the security services, these monopolies had been privatized and broken up into smaller companies during the 1980s, and in the new deregulated landscape the agencies had to seek assistance from thousands of startup companies that were often unwilling to help. So for the first time in history, technological and institutional changes posed a threat to the security services' wiretapping activities, and government officials in Washington and London acted quickly to protect their ability to spy: they sought to force the industry to change the very architecture of the digital telecommunications network. This book describes in detail the tense negotiations between governments, the telecommunications industry, and civil liberties groups during an unprecedented moment in history when the above security agencies were unable to wiretap.
It reveals for the first time the thoughts of some of the protagonists in these crucial negotiations, and explains why their outcome may have forever altered the trajectory of our information society.
Research on Secure Key Establishment has become very active within the last few years. Secure Key Establishment discusses the problems encountered in this field. This book also introduces several improved protocols with new proofs of security. Secure Key Establishment identifies several variants of the key sharing requirement. Several variants of the widely accepted Bellare and Rogaway (1993) model are covered. A comparative study of the relative strengths of security notions between these variants of the Bellare-Rogaway model and the Canetti-Krawczyk model is included. An integrative framework is proposed that allows protocols to be analyzed in a modified version of the Bellare-Rogaway model using an automated model checker. Secure Key Establishment is designed for advanced-level students in computer science and mathematics, as a secondary text or reference book. This book is also suitable for practitioners and researchers working for defense agencies or security companies.
This book deals with how to measure innovation in crisis management, drawing on data, case studies, and lessons learnt from different European countries. The aim of this book is to tackle innovation in crisis management through lessons learnt and experiences gained from the implementation of mixed methods through a practitioner-driven approach in a large-scale demonstration project (DRIVER+). It explores innovation from the perspective of the end-users by focusing on the needs and problems they are trying to address through a tool (be it an app, a drone, or a training program) and takes a deep dive into what is needed to understand if and to what extent the tool they have in mind can really bring innovation. This book is a toolkit for readers interested in understanding what needs to be in place to measure innovation: it provides the know-how through examples and best practices. The book will be a valuable source of knowledge for scientists, practitioners, researchers, and postgraduate students studying safety, crisis management, and innovation.
Focusing on the critical role IT plays in organizational development, the book shows how to employ action learning to improve the competitiveness of an organization. Defining the current IT problem from an operational and strategic perspective, it presents a collection of case studies that illustrate key learning issues. It details a dynamic model for effective IT management through adaptive learning techniques, supplying proven educational theories and practices to foster the required changes in your staff. It examines existing organizational learning theories and the historical problems that occurred with companies that have used them, as well as those that have failed to use them.
Technology has been the spark that ignited NATO's interest in and commitment to scientific advancement throughout its history. Since its creation, the Science for Peace and Security (SPS) Programme has been instrumental to NATO's commitment to innovation, science and technological advancement. Over the years, SPS has demonstrated a flexible and versatile approach to practical scientific cooperation, promoting knowledge-sharing, building capacity, and projecting stability outside NATO territory. The priorities addressed by the SPS Programme are aligned with NATO's strategic objectives, and aim to tackle emerging security challenges that require dynamic adaptation for the prevention and mitigation of risks. By addressing priorities such as advanced technologies, hybrid threats, and counter-terrorism, the Programme deals with new, contemporary challenges. On 17-18 September 2019, the SPS Programme gathered at KU Leuven University a large number of researchers from a selection of ongoing and recently closed SPS projects in the field of security-related advanced technologies for a "Cluster Workshop on Advanced Technologies". The workshop covered, in particular, the following scientific domains: communication systems, advanced materials, sensors and detectors, and unmanned and autonomous systems. This book provides an overview of how these projects have contributed to the development of new technologies and innovative solutions, and offers recommendations for future actions in the NATO SPS Programme.
This book is a compilation of technical papers presented at the International Research Symposium on Computing and Network Sustainability (IRSCNS 2016), held in Goa, India on 1-2 July 2016. The areas covered in the book are sustainable computing and security, sustainable systems and technologies, sustainable methodologies and applications, sustainable network applications and solutions, user-centered services and systems, and mobile data management. The novel and recent technologies presented in the book will be helpful for researchers and industry practitioners in their advanced work.
Advanced Wired and Wireless Networks brings the reader a sample of recent research efforts representative of advances in the areas of recognized importance for the future Internet, such as ad hoc networking, mobility support and performance improvements in advanced networks and protocols. Advanced Wired and Wireless Networks is structured to meet the needs of a professional audience in industry, as well as graduate-level students in computer science and engineering.
As with many other technological innovations, scientists are looking to protect the Internet of Things (IoT) from loss, theft, or misuse. As one of the current hot trends in the digital world, blockchain technology could be the solution for securing the IoT. Blockchain Applications in IoT Security presents research for understanding IoT-generated data security issues, existing security facilities and their limitations and future possibilities, and the role of blockchain technology. Featuring coverage of a broad range of topics such as cryptocurrency, remote monitoring, and smart computing, this book is ideally designed for security analysts, IT specialists, entrepreneurs, business professionals, academicians, researchers, students, and industry professionals seeking current studies on the limitations and possibilities behind competitive blockchain technologies.
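The property that makes blockchains attractive for IoT data security - tamper evidence via hash chaining - can be shown in miniature. This sketch is not from the book; the sensor names and block layout are hypothetical, and a real deployment would add signatures and consensus on top of the chaining shown here.

```python
import hashlib
import json

def make_block(payload: dict, prev_hash: str) -> dict:
    # Each block commits to its payload and to the previous block's hash,
    # so altering any earlier reading breaks every later link.
    body = {"payload": payload, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def verify(chain) -> bool:
    # Recompute every block's hash and check each link to its predecessor.
    for i, block in enumerate(chain):
        body = {"payload": block["payload"], "prev": block["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if digest != block["hash"]:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

# Hypothetical IoT sensor readings appended to the chain.
chain = [make_block({"sensor": "temp-01", "value": 21.5}, "0" * 64)]
chain.append(make_block({"sensor": "temp-01", "value": 21.7}, chain[-1]["hash"]))
print(verify(chain))                       # True
chain[0]["payload"]["value"] = 99.9        # tamper with an old reading
print(verify(chain))                       # False: tampering is detected
```

Even this toy version shows why a ledger of IoT readings cannot be silently rewritten: changing one stored value invalidates its block's hash and every block after it.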
This book applies the concept of synchronization to security of global heterogeneous and hetero-standard systems by modeling the relationship of risk access spots (RAS) between advanced and developing economies network platforms. The proposed model is more effective in securing the electronic security gap between these economies with reference to real life applications, such as electronic fund transfer in electronic business. This process involves the identification of vulnerabilities on communication networks. This book also presents a model and simulation of an integrated approach to security and risk known as Service Server Transmission Model (SSTM).
Big data technologies are used to achieve many types of analytics in a fast and predictable way, enabling better human and machine-level decision making. Principles of distributed computing are the keys to big data technologies and analytics. The mechanisms related to data storage, data access, data transfer, visualization and predictive modeling using distributed processing in multiple low-cost machines are the key considerations that make big data analytics practical, within stipulated cost and time, for consumption by humans and machines. However, the current literature on big data analytics needs a holistic perspective that highlights the relation between big data analytics and distributed processing, for ease of understanding and practitioner use. This book fills that gap by addressing key aspects of distributed processing in big data analytics. The chapters tackle the essential concepts and patterns of distributed computing widely used in big data analytics. The book also covers the main technologies that support distributed processing. Finally, it provides insight into applications of big data analytics, highlighting how principles of distributed computing are used in those situations. Practitioners and researchers alike will find this book a valuable tool for their work, helping them to select the appropriate technologies while understanding the inherent strengths and drawbacks of those technologies.
This book reviews existing operational software failure analysis techniques and proposes near-miss analysis as a novel technique for investigating and preventing software failures. The authors provide details on how near-miss analysis focuses on the time window before a software failure actually unfolds, so as to detect the high-risk conditions that can lead to a major failure. They detail how, by alerting system users of an upcoming software failure, the detection of near misses provides an opportunity to collect at runtime failure-related data that is complete and relevant. They present a near-miss management system (NMS) for detecting upcoming software failures, which can contribute significantly to improving the accuracy of software failure analysis. A prototype of the NMS is implemented and discussed in the book. The authors give a practical, hands-on approach to software failure investigation by means of near-miss analysis that is of use to industry and academia.
This book introduces context-aware computing, providing definitions, categories, and characteristics of context and context awareness, and discussing its applications with a particular focus on smart learning environments. It also examines the elements of a context-aware system, including the acquisition, modelling, reasoning, and distribution of context. It then reviews applications of context-aware computing - both past and present - to offer readers the knowledge needed to critically analyse how context awareness can be put to use. It is particularly suited to those new to the subject area who are interested in learning how to develop context-aware applications, as well as postgraduates and researchers in computer engineering, communications engineering, and related areas of information technology (IT). Further, it provides practical know-how for professionals working in IT support and technology, consultants, business decision-makers, and those working in the medical, human, and social sciences.
Internet heterogeneity is driving a new challenge in application development: adaptive software. Alongside increased Internet capacity and new access technologies, network congestion, the continued use of older technologies, wireless access, and peer-to-peer networking are all increasing the heterogeneity of the Internet. Applications should provide gracefully degraded levels of service when network conditions are poor, and enhanced services when network conditions exceed expectations. Existing adaptive technologies, which are primarily end-to-end or proxy-based and often focus on a single deficient link, can perform poorly in heterogeneous networks. Instead, heterogeneous networks frequently require multiple, coordinated, and distributed remedial actions. Conductor: Distributed Adaptation for Heterogeneous Networks describes a new approach to graceful degradation in the face of network heterogeneity - distributed adaptation - in which adaptive code is deployed at multiple points within a network. The feasibility of this approach is demonstrated by Conductor, a middleware framework that enables distributed adaptation of connection-oriented, application-level protocols. By adapting protocols, Conductor provides application-transparent adaptation, supporting both existing applications and applications designed with adaptation in mind. Conductor: Distributed Adaptation for Heterogeneous Networks introduces new techniques that enable distributed adaptation, making it automatic, reliable, and secure. In particular, we introduce the notion of semantic segmentation, which maintains exactly-once delivery of the semantic elements of a data stream while allowing the stream to be arbitrarily adapted in transit. We also introduce a secure architecture for automatic adaptor selection, protecting user data from unauthorized adaptation. These techniques are described both in the context of Conductor and in the broader context of distributed systems.
Finally, this book presents empirical evidence from several case studies indicating that distributed adaptation can allow applications to degrade gracefully in heterogeneous networks, providing a higher quality of service to users than other adaptive techniques. Further, experimental results indicate that the proposed techniques can be employed without excessive cost. Thus, distributed adaptation is both practical and beneficial. Conductor: Distributed Adaptation for Heterogeneous Networks is designed to meet the needs of a professional audience composed of researchers and practitioners in industry and graduate-level students in computer science.
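The notion of semantic segmentation described above - delivering each semantic element of a stream exactly once while letting intermediaries rewrite the bytes in transit - can be illustrated with simple length-prefixed framing. This is a toy sketch, not Conductor's actual protocol: the framing format, the `adapt` hook, and the sample messages are all assumptions made for illustration.

```python
import struct

def frame(elements):
    # Length-prefix each semantic element so segment boundaries survive
    # arbitrary in-transit transformation of the element bodies.
    return b"".join(struct.pack(">I", len(e)) + e for e in elements)

def deframe(stream: bytes):
    # Recover the individual elements from a framed stream.
    elems, i = [], 0
    while i < len(stream):
        (n,) = struct.unpack_from(">I", stream, i)
        elems.append(stream[i + 4 : i + 4 + n])
        i += 4 + n
    return elems

def adapt(stream: bytes, transform):
    # An intermediate node transforms each element and re-frames the result:
    # the byte stream changes arbitrarily, but each semantic element still
    # arrives exactly once and in order.
    return frame(transform(e) for e in deframe(stream))

msgs = [b"GET /index.html", b"GET /logo.png"]
adapted = adapt(frame(msgs), lambda e: e.lower())  # stand-in adaptation step
print(deframe(adapted))
```

The key property is that adaptation operates on whole elements between the framing layers, so no element can be split, duplicated, or dropped by an adaptor along the path.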
This book comprises peer-reviewed contributions presented at the 5th International Conference on Electronics, Communications and Networks (CECNet 2015), held in Shanghai, China, 12-15 December, 2015. It includes new multi-disciplinary topics spanning a unique depth and breadth of cutting-edge research areas in Electronic Engineering, Communications and Networks, and Computer Technology. More generally, it is of interest to academics, students and professionals involved in Consumer Electronics Technology, Communication Engineering and Technology, Wireless Communication Systems and Technology, and Computer Engineering and Technology.
This book offers means to handle interference as a central problem of operating wireless networks. It investigates centralized and decentralized methods to avoid and handle interference, as well as approaches that resolve interference constructively. The latter type of approach tries to solve the joint detection and estimation problem of several data streams that share a common medium. In fact, an exciting insight into the operation of networks is that it may be beneficial, in terms of overall throughput, to actively create and manage interference. Thus, when handled properly, "mixing" of data in networks becomes a useful tool of operation rather than the nuisance it has traditionally been treated as. With the development of mobile, robust, ubiquitous, reliable and instantaneous communication being a driving and enabling factor of an information-centric economy, the understanding, mitigation and exploitation of interference in networks must be seen as a centrally important task.
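A minimal illustration of why deliberately "mixing" data can raise throughput is the classic two-way relay with XOR network coding - offered here as a generic textbook example, not as the book's own scheme; the packet contents and equal-length assumption are mine.

```python
def xor(a: bytes, b: bytes) -> bytes:
    # Bitwise XOR of two equal-length packets.
    return bytes(x ^ y for x, y in zip(a, b))

# Two-way relay: A and B each send one packet to a relay. Instead of
# forwarding the two packets separately (two transmissions), the relay
# broadcasts a single deliberately "interfered" packet (one transmission).
pkt_a, pkt_b = b"hello-from-A", b"hello-from-B"
mixed = xor(pkt_a, pkt_b)

# Each endpoint knows its own packet, so it can cancel its contribution
# from the mixture and recover the other side's packet exactly.
recovered_at_a = xor(mixed, pkt_a)
recovered_at_b = xor(mixed, pkt_b)
print(recovered_at_a == pkt_b and recovered_at_b == pkt_a)  # True
```

The relay's broadcast is, physically, interference between the two flows - yet because each receiver can subtract what it already knows, the mixing saves a transmission rather than destroying information.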