User authentication is the process of verifying whether the identity of a user is genuine prior to granting him or her access to resources or services in a secured environment. Traditionally, user authentication is performed statically at the point of entry to the system; continuous authentication (CA) seeks to address the shortcomings of this method by providing increased session security and combating insider threats. Continuous Authentication Using Biometrics: Data, Models, and Metrics presents chapters on continuous authentication using biometrics contributed by leading experts in this recent, fast-growing research area. Together, these chapters provide a thorough and concise introduction to the field of biometric-based continuous authentication. The book covers the conceptual framework underlying continuous authentication and presents detailed processing models for various types of practical continuous authentication applications.
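The continuous-authentication idea described above can be sketched as a sliding-window decision loop. This is a minimal illustration, not a method from the book: the matcher, window size, threshold, and score values are all hypothetical.

```python
from collections import deque

def continuous_authenticate(scores, window=5, threshold=0.6):
    """Illustrative sliding-window continuous-authentication decision.

    `scores` are per-sample match scores in [0, 1] from some biometric
    matcher (keystroke dynamics, gait, etc. -- hypothetical here).
    The session stays authenticated while the windowed average stays at
    or above `threshold`; returns the index at which the session would
    be locked, or None if it never is.
    """
    recent = deque(maxlen=window)
    for i, s in enumerate(scores):
        recent.append(s)
        if len(recent) == window and sum(recent) / window < threshold:
            return i  # lock the session at this sample
    return None

# A genuine user drifts briefly; an impostor taking over drives scores down.
genuine = [0.9, 0.8, 0.85, 0.7, 0.75, 0.8, 0.9]
impostor_takeover = [0.9, 0.85, 0.8, 0.3, 0.2, 0.25, 0.2]
print(continuous_authenticate(genuine))            # None: never locked
print(continuous_authenticate(impostor_takeover))  # 5: locked mid-session
```

Averaging over a window, rather than reacting to a single low score, is what distinguishes continuous authentication from repeated static checks: a brief anomaly does not end the session, a sustained one does.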
IP technology has progressed from a scientific topic to one of the most popular technologies in networking. Concurrently, a number of new innovations and technological advances have been developed and brought to the marketplace. These new ideas, concepts and products are likely to have a tremendous influence on businesses and on our everyday lives. This book addresses many of these newer technological developments, provides insights for engineers and scientists developing new technological components, devices and products, and explores how they are being implemented in the real world. The author examines numerous implementation details related to IP equipment and software. The material is organized by application so that readers can better understand the uses of IP technology, with details of implementation issues and state-of-the-art equipment and software. Descriptions of the Cisco 12410 GSR and Juniper M160 are provided, and IP software stack details are included for several popular operating systems such as Windows, BSD, VxWorks and Linux.
Metropolitan Area WDM Networks: An AWG Based Approach provides a comprehensive and technically detailed overview of the latest metropolitan area WDM network experimental systems, architectures, and access protocols. Its main focus is on the novel star WDM networks based on a wavelength-selective Arrayed-Waveguide Grating (AWG). Network researchers, engineers, professionals, and graduate students will benefit from the thorough overview and gain an in-depth understanding of current and next-generation metro WDM networks. The AWG-based metro star WDM network is discussed at length and extensively investigated by means of stochastic analyses and simulations.
Security is the science and technology of secure communications and resource protection from security violations such as unauthorized access and modification. Putting proper security in place gives us many advantages. It lets us exchange confidential information and keep it confidential. We can be sure that a piece of information received has not been changed. Nobody can deny sending or receiving a piece of information. We can control which piece of information can be accessed, and by whom. We can know when a piece of information was accessed, and by whom. Networks and databases are guarded against unauthorized access. We have seen the rapid development of the Internet and also increasing security requirements in information networks, databases, systems, and other information resources. This comprehensive book responds to increasing security needs in the marketplace, and covers networking security and standards. There are three types of readers who are interested in security: non-technical readers, general technical readers who do not implement security, and technical readers who actually implement security. This book serves all three by providing a comprehensive explanation of fundamental issues of networking security, the concepts and principles of security standards, and a description of some emerging security technologies. The approach is to answer the following questions: 1. What are common security problems and how can we address them? 2. What are the algorithms, standards, and technologies that can solve common security problems? 3.
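One of the guarantees listed above ("a piece of information received has not been changed") is conventionally provided by a message authentication code. As a small sketch using Python's standard library (the key and message are made-up values, not from the book):

```python
import hashlib
import hmac

# Shared secret between sender and receiver (hypothetical value).
key = b"shared-secret-key"

def tag(message: bytes) -> bytes:
    """Sender attaches an HMAC-SHA256 tag so modification is detectable."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(message: bytes, mac: bytes) -> bool:
    """Receiver recomputes the tag; constant-time compare avoids timing leaks."""
    return hmac.compare_digest(tag(message), mac)

msg = b"transfer 100 to alice"
mac = tag(msg)
print(verify(msg, mac))                       # True: message unmodified
print(verify(b"transfer 900 to alice", mac))  # False: tampering detected
```

Confidentiality, non-repudiation, and access control each rely on different primitives (ciphers, digital signatures, authorization policies); integrity checking is simply the easiest of the listed guarantees to demonstrate in a few lines.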
This book provides a timely overview of the impacts of digitalization from the perspective of everyday life, and argues that one central issue in digitalization is the development of new types of services that digitalization enables, but which are often overlooked due to the focus on new technologies and devices. The book summarizes the past 20 years of research into the relationship between information and communications technology (ICT) and service innovation, and reveals that the ongoing digitalization is a qualitatively different phenomenon and represents a true paradigm shift. The all-encompassing integration and distribution of data raises critical issues such as preserving human dignity and individual autonomy; moreover, interaction practices that foster broad participation, trust, learning, and a willingness to share knowledge are called for. Citizen empowerment and multi-actor co-creation have become central to using digitalization to support the development of wellbeing and sustainability. Further, the book shows how employees and professionals can and should be involved in designing their future work, and in evaluating it. Proactiveness and participation in innovation endeavours are ways to guarantee meaningful work in an age of socio-technical transition. The book employs a variety of theoretical approaches and perspectives from diverse disciplines to illustrate these needs. In addition to theoretical analyses, some specific application areas are examined, e.g. services in health and social care, and problems linked to robots in elderly care. Given its scope, the book is highly recommended to all readers seeking an overview of the current understanding of the human side of digitalization and searching for concrete cases from different countries to illustrate the topic.
Introduction: Background and Status. Design before Evaluation. Prerequisite Knowledge Areas: Supportive Tools and Techniques. Interface Structures. Basic Measures. Measurement and Evaluation: Evaluation Terms and Aspects. Tailored Measures of Performance. Evaluation Approaches and Methods. Special Topics: Stress and User Satisfaction. Visualizable Objects and Spaces. Interaction and Mental Involvement. Structural Specification and Utility. Index.
The information infrastructure - comprising computers, embedded devices, networks and software systems - is vital to day-to-day operations in every sector: information and telecommunications, banking and finance, energy, chemicals and hazardous materials, agriculture, food, water, public health, emergency services, transportation, postal and shipping, government and defense. Global business and industry, governments, indeed society itself, cannot function effectively if major components of the critical information infrastructure are degraded, disabled or destroyed. Critical Infrastructure Protection describes original research results and innovative applications in the interdisciplinary field of critical infrastructure protection. Also, it highlights the importance of weaving science, technology and policy in crafting sophisticated, yet practical, solutions that will help secure information, computer and network assets in the various critical infrastructure sectors. Areas of coverage include: themes and issues; control systems security; infrastructure modeling and simulation; risk and impact assessment. This book is the tenth volume in the annual series produced by the International Federation for Information Processing (IFIP) Working Group 11.10 on Critical Infrastructure Protection, an international community of scientists, engineers, practitioners and policy makers dedicated to advancing research, development and implementation efforts focused on infrastructure protection. The book contains a selection of fourteen edited papers from the Tenth Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection, held at SRI International, Arlington, Virginia, USA in the spring of 2016. Critical Infrastructure Protection is an important resource for researchers, faculty members and graduate students, as well as for policy makers, practitioners and other individuals with interests in homeland security.
Three important technology issues face professionals in today's business, education, and government world. In "Privacy, Identity, and Cloud Computing," author and computer expert Dr. Harry Katzan Jr. addresses the subjects of privacy and identity as they relate to the new discipline of cloud computing, a model for providing on-demand access to computing services via the Internet. A compendium of eight far-reaching papers, "Privacy, Identity, and Cloud Computing" thoroughly dissects and discusses the following: the privacy of cloud computing; identity as a service; identity analytics and belief structures; compatibility relations in identity analysis; a conspectus of cloud computing; cloud computing economics (the democratization and monetization of services); an ontological view of cloud computing; and privacy as a service. Katzan provides not only a wealth of information, but also exposure to these topics facing today's computer users. Ultimately, these are important facets of modern computing, and all their implications must be considered thoroughly in anticipation of future developments.
Collaborative Networks is a fast-developing area, as shown by the already large number of diverse real-world implemented cases and the dynamism of its related research community. Benefiting from contributions of multiple areas, namely management, economics, social sciences, law and ethics, etc., the area of Collaborative Networks is being consolidated as a new scientific discipline of its own. Significant steps towards a stronger theoretical foundation for this new discipline are being developed and applied in industry and services. Based on the experiences and lessons learned in many research projects and pilot cases developed during the last decade, a new emphasis is now being put on the development of holistic frameworks, combining business models, conceptual models, governance principles and methods, as well as supporting infrastructures and services. Now that computer and networking technologies provide a good starting basis for the establishment of collaborative platforms, the emphasis is turning to the understanding of collaboration promotion mechanisms and CN governance principles. Therefore, issues such as value systems, trust, performance and benefits distribution are gaining importance. Encompassing all these developments, the efforts to develop reference models for collaborative networks represent a major challenge in providing the foundation for further developments of the CN area. PRO-VE represents a good synthesis of the work in this area and plays an active role in the promotion of these activities. Recognized as the most focused scientific and technical conference on Collaborative Networks, PRO-VE continues to offer the opportunity for presentation and discussion of both the latest research developments and practical application case studies.
Following the vision of IFIP and SOCOLNET, the PRO-VE conference offers a forum for collaboration and knowledge exchange among experts from different regions of the world.
Lo, soul! seest thou not God's purpose from the first? The earth to be spann'd, connected by net-work. (From "Passage to India," Walt Whitman, Leaves of Grass, 1900.) The Internet is growing at a tremendous rate today. New services, such as telephony and multimedia, are being added to the pure data-delivery framework of yesterday. Such high demands on capacity could lead to a "bandwidth-crunch" at the core wide-area network, resulting in degradation of service quality. Fortunately, technological innovations have emerged which can provide relief to the end-user to overcome the Internet's well-known delay and bandwidth limitations. At the physical layer, a major overhaul of existing networks has been envisaged, from electronic media (such as twisted-pair and cable) to optical fibers - in the wide area, in the metropolitan area, and even in local area settings. In order to exploit the immense bandwidth potential of the optical fiber, interesting multiplexing techniques have been developed over the years. Wavelength division multiplexing (WDM) is one such promising technique, in which multiple channels are operated along a single fiber simultaneously, each on a different wavelength. These channels can be independently modulated to accommodate dissimilar bit rates and data formats, if so desired. Thus, WDM carves up the huge bandwidth of an optical fiber into channels whose bandwidths (1-10 Gbps) are compatible with peak electronic processing speed.
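The "carving up" described above is simple multiplication: aggregate fiber capacity is the number of wavelengths times the per-channel rate. A tiny sketch (the channel counts below are illustrative assumptions; only the 1-10 Gbps per-channel range comes from the text):

```python
def wdm_aggregate_gbps(num_channels: int, per_channel_gbps: float) -> float:
    """Total fiber capacity when each wavelength carries its own channel."""
    return num_channels * per_channel_gbps

# Hypothetical systems at the low and high end of the stated channel-rate range.
for channels, rate in [(16, 2.5), (40, 10.0)]:
    total = wdm_aggregate_gbps(channels, rate)
    print(f"{channels} wavelengths x {rate} Gbps = {total} Gbps")
```

The point of the arithmetic is the mismatch it resolves: no single electronic transceiver runs at hundreds of Gbps, but forty 10 Gbps channels on one fiber do.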
This book provides an original graph theoretical approach to the fundamental properties of wireless mobile ad-hoc networks. This approach is combined with a realistic radio model for physical links between nodes to produce new insight into network characteristics like connectivity, degree distribution, hopcount, interference and capacity. The book establishes directives for designing ad-hoc networks and sensor networks. It will interest the academic community, and engineers who roll out ad-hoc and sensor networks.
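Properties such as degree distribution and connectivity are often first studied on the simple disc model: nodes scattered uniformly in a unit square, linked when within radio range. A minimal sketch of that baseline (the book's radio model is more realistic; the node count, range, and seed here are arbitrary illustrative choices):

```python
import math
import random

def mean_degree(n=200, radius=0.12, seed=1):
    """Average node degree in a unit-square random geometric graph:
    two nodes are linked when within radio range `radius` of each other."""
    rng = random.Random(seed)
    pts = [(rng.random(), rng.random()) for _ in range(n)]
    deg = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(pts[i], pts[j]) <= radius:
                deg[i] += 1
                deg[j] += 1
    return sum(deg) / n

# Interior nodes expect about (n-1) * pi * r^2 neighbours; border effects
# pull the empirical mean somewhat below that.
print(round(mean_degree(), 2))
```

Replacing the hard disc threshold with a probabilistic link model (pathloss plus fading) is exactly the kind of refinement that changes the predicted connectivity and interference behaviour.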
This year, the IFIP Working Conference on Distributed and Parallel Embedded Systems (DIPES 2008) is held as part of the IFIP World Computer Congress, held in Milan on September 7-10, 2008. The embedded systems world has a great deal of experience with parallel and distributed computing. Many embedded computing systems require the high performance that can be delivered by parallel computing. Parallel and distributed computing are often the only ways to deliver adequate real-time performance at low power levels. This year's conference attracted 30 submissions, of which 21 were accepted. Prof. Jörg Henkel of the University of Karlsruhe graciously contributed a keynote address on embedded computing and reliability. We would like to thank all of the program committee members for their diligence. Wayne Wolf, Bernd Kleinjohann, and Lisa Kleinjohann. Acknowledgements: We would like to thank all people involved in the organization of the IFIP World Computer Congress 2008, especially the IPC Co-Chairs Judith Bishop and Ivo De Lotto, the Organization Chair Giulio Occhini, as well as the Publications Chair John Impagliazzo. Further thanks go to the authors for their valuable contributions to DIPES 2008. Last but not least, we would like to acknowledge the considerable amount of work and enthusiasm spent by our colleague Claudius Stern in preparing the proceedings of DIPES 2008. He made it possible to produce them in their current professional and homogeneous style.
With the rapid increase of mobile users of laptop computers and cellular phones, support of Internet services like e-mail and World Wide Web (WWW) access in a mobile environment is an indispensable requirement. The wireless networks must have the ability to carry real-time bursty traffic (such as voice or video) and data traffic in a multimedia environment with high quality of service. To satisfy the huge demand for wireless multimedia service, efficient channel access methods must be devised. For the design and tuning of channel access methods, the system performance must be mathematically analysed. To do so, very accurate models that faithfully reproduce the stochastic behaviour of multimedia wireless communication and computer networks must be constructed.
The ubiquitous nature of the Internet is enabling a new generation of applications to support collaborative work among geographically distant users. Security in such an environment is of utmost importance to safeguard the privacy of the communication and to ensure the integrity of the applications. 'Secure group communications' (SGC) refers to a scenario in which a group of participants can receive and send messages to group members, in a way that outsiders are unable to glean any information even when they are able to intercept the messages. SGC is becoming extremely important for researchers and practitioners because many applications that require SGC are now widely used, such as teleconferencing, tele-medicine, real-time information services, distributed interactive simulations, collaborative work, grid computing, and the deployment of VPNs (Virtual Private Networks). Even though considerable research accomplishments have been achieved in SGC, few books exist on this very important topic. The purpose of this book is to provide a comprehensive survey of principles and state-of-the-art techniques for secure group communications over data networks. The book is targeted towards practitioners, researchers and students in the fields of networking, security, and software applications development. The book consists of 7 chapters, which are listed and described as follows.
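A core SGC problem is rekeying: when a member leaves, a new group key must reach every remaining member without the leaver learning it. The logical key hierarchy (LKH) is a standard technique for this, though the text above does not say which specific schemes the book's chapters cover; the sketch below just computes its message cost against naive pairwise rekeying.

```python
import math

def lkh_rekey_messages(group_size: int) -> int:
    """Rekey messages needed when one member leaves a group whose keys
    form a balanced binary key tree (logical key hierarchy): roughly
    2 * log2(n) encrypted key updates, one pair per tree level."""
    return 2 * math.ceil(math.log2(group_size))

def naive_rekey_messages(group_size: int) -> int:
    """Naive alternative: send the new key to each remaining member."""
    return group_size - 1

for n in (8, 1024, 1_000_000):
    print(n, lkh_rekey_messages(n), naive_rekey_messages(n))
```

The logarithmic versus linear gap is why tree-based key management makes SGC practical for the large, dynamic groups mentioned above (teleconferencing, distributed simulations).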
The interaction paradigm is a new conceptualization of computational phenomena that emphasizes interaction over algorithms, reflecting the shift in technology from main-frame number-crunching to distributed intelligent networks with graphical user interfaces. The book is arranged in four sections: "Introduction," comprising three chapters that explore and summarize the fundamentals of interactive computation; "Theory" with six chapters, each discussing a specific aspect of interaction; "Applications," five chapters showing how this principle is applied in subdisciplines of computer science; and "New Directions," presenting four multidisciplinary applications. The book challenges traditional Turing machine-based answers to fundamental questions of problem solving and the scope of computation.
Cybersecurity for medical devices is no longer optional. We must not allow sensationalism or headlines to drive the discussion... Nevertheless, we must proceed with urgency. In the end, this is about preventing patient harm and preserving patient trust. A comprehensive guide to medical device secure lifecycle management, this is a book for engineers, managers, and regulatory specialists. Readers gain insight into the security aspects of every phase of the product lifecycle, including concept, design, implementation, supply chain, manufacturing, postmarket surveillance, maintenance, updates, and end of life. Learn how to mitigate or completely avoid common cybersecurity vulnerabilities introduced during development and production. Grow your awareness of cybersecurity development topics ranging from high-level concepts to practical solutions and tools. Get insight into emerging regulatory and customer expectations. Uncover how to minimize schedule impacts and accelerate time-to-market while still accomplishing the main goal: reducing patient and business exposure to cybersecurity risks. Medical Device Cybersecurity for Engineers and Manufacturers is designed to help all stakeholders lead the charge to a better medical device security posture and improve the resilience of our medical device ecosystem.
This book constitutes Part III of the refereed four-volume post-conference proceedings of the 4th IFIP TC 12 International Conference on Computer and Computing Technologies in Agriculture, CCTA 2010, held in Nanchang, China, in October 2010. The 352 revised papers presented were carefully selected from numerous submissions. They cover a wide range of interesting theories and applications of information technology in agriculture, including simulation models and decision-support systems for agricultural production, agricultural product quality testing, traceability and e-commerce technology, the application of information and communication technology in agriculture, and universal information service technology and service systems development in rural areas.
The area of intelligent and adaptive user interfaces has been of interest to the research community for a long time. Much effort has been spent in trying to find a stable theoretical base for adaptivity in human-computer interaction and to build prototypical systems showing features of adaptivity in real-life interfaces. To date research in this field has not led to a coherent view of problems, let alone solutions. A workshop was organized, which brought together a number of well-known researchers in the area of adaptive user interfaces with a view to
This book is the combined proceedings of the latest IFIP Formal Description Techniques (FDTs) and Protocol Specification, Testing and Verification (PSTV) series. It addresses FDTs applicable to communication protocols and distributed systems, with special emphasis on standardised FDTs. It features state-of-the-art in theory, application, tools and industrialisation of formal description.
With the fast development of networking and software technologies, information processing infrastructure and applications have been growing at an impressive rate in both size and complexity, to such a degree that the design and development of high-performance and scalable data processing systems and networks have become an ever-challenging issue. As a result, the use of performance modeling and measurement techniques as a critical step in design and development has become a common practice. Research and development on the methodology and tools of performance modeling and performance engineering have gained further importance in order to improve the performance and scalability of these systems. Since the seminal work of A. K. Erlang almost a century ago on the modeling of telephone traffic, performance modeling and measurement have grown into a discipline and have been evolving both in their methodologies and in the areas in which they are applied. It is noteworthy that various mathematical techniques were brought into this field, including in particular probability theory, stochastic processes, statistics, complex analysis, stochastic calculus, stochastic comparison, optimization, control theory, machine learning and information theory. The application areas extended from telephone networks to Internet and Web applications, from computer systems to computer software, from manufacturing systems to supply chain, from call centers to workforce management.
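Erlang's telephone-traffic work mentioned above produced the classic Erlang B formula, which gives the probability that a call arriving at a group of trunks finds them all busy. A small sketch using the standard numerically stable recurrence (the traffic and trunk figures are illustrative):

```python
def erlang_b(traffic_erlangs: float, servers: int) -> float:
    """Blocking probability from Erlang's B formula, via the recurrence:
        B(E, 0) = 1
        B(E, m) = E * B(E, m-1) / (m + E * B(E, m-1))
    which avoids the overflow-prone factorials of the closed form."""
    b = 1.0
    for m in range(1, servers + 1):
        b = traffic_erlangs * b / (m + traffic_erlangs * b)
    return b

# 10 erlangs of offered traffic on 15 trunks: a few percent of calls blocked.
print(round(erlang_b(10.0, 15), 4))
```

The same loss-system reasoning, with traffic in erlangs and servers generalized to links, processors, or agents, underlies many of the later application areas the paragraph lists, from call centers to workforce management.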
Cloud Computing has already been embraced by many organizations and individuals due to its benefits of economy, reliability, scalability and guaranteed quality of service among others. But since the data is not stored, analysed or computed on site, this can open security, privacy, trust and compliance issues. This one-stop reference covers a wide range of issues on data security in Cloud Computing ranging from accountability, to data provenance, identity and risk management. Data Security in Cloud Computing covers major aspects of securing data in Cloud Computing. Topics covered include NOMAD: a framework for ensuring data confidentiality in mission-critical cloud based applications; 3DCrypt: privacy-preserving pre-classification volume ray-casting of 3D images in the cloud; multiprocessor system-on-chip for processing data in Cloud Computing; distributing encoded data for private processing in the cloud; data protection and mobility management for cloud; understanding software defined perimeter; security, trust and privacy for Cloud Computing in transportation cyber-physical systems; review of data leakage attack techniques in cloud systems; Cloud Computing and personal data processing: sorting out legal requirements; the Waikato data privacy matrix; provenance reconstruction in clouds; and security visualization for Cloud Computing.
A global information revolution has begun. Converging communications and computing technologies are forming information superhighways, linking people and information interactively, at any time, in any place, via a combination of multimedia, digital video, sound, graphics, and text.
Today's advancements in technology have brought about a new era of speed and simplicity for consumers and businesses. Due to these new benefits, the possibilities of universal connectivity, storage and computation are made tangible, thus leading the way to new Internet-of-Things solutions. Resource Management and Efficiency in Cloud Computing Environments is an authoritative reference source for the latest scholarly research on the emerging trends of cloud computing and reveals the benefits cloud paths provide to consumers. Featuring coverage across a range of relevant perspectives and topics, such as big data, cloud security, and utility computing, this publication is an essential source for researchers, students and professionals seeking current research on the organization and productivity of cloud computing environments. Topics covered: big data; cloud application services (SaaS); cloud security; hybrid cloud; Internet of Things (IoT); private cloud; public cloud; service-oriented architecture (SOA); utility computing; and virtualization technology.