This book provides an original graph theoretical approach to the fundamental properties of wireless mobile ad-hoc networks. This approach is combined with a realistic radio model for physical links between nodes to produce new insight into network characteristics like connectivity, degree distribution, hopcount, interference and capacity. The book establishes directives for designing ad-hoc networks and sensor networks. It will interest the academic community, and engineers who roll out ad-hoc and sensor networks.
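The standard abstraction behind this kind of analysis is the random geometric graph: nodes scattered in the plane, with a link wherever two nodes are within radio range (the book refines this with a realistic radio model). As an illustrative sketch only, not taken from the book, the connectivity and degree distribution it studies can be estimated empirically:

```python
import random
import math

def random_geometric_graph(n, radius, seed=0):
    """Place n nodes uniformly in the unit square; link pairs within `radius`."""
    rng = random.Random(seed)
    pts = [(rng.random(), rng.random()) for _ in range(n)]
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(pts[i], pts[j]) <= radius:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def is_connected(adj):
    """Breadth-first search from node 0; connected iff every node is reached."""
    seen, frontier = {0}, [0]
    while frontier:
        nxt = []
        for u in frontier:
            for v in adj[u] - seen:
                seen.add(v)
                nxt.append(v)
        frontier = nxt
    return len(seen) == len(adj)

adj = random_geometric_graph(100, 0.2)
degrees = [len(adj[i]) for i in adj]
print(sum(degrees) / len(degrees), is_connected(adj))
```

Sweeping the radius (or node density) in such a simulation reproduces the sharp connectivity transition that the graph-theoretic treatment makes precise.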
With the fast development of networking and software technologies, information processing infrastructure and applications have been growing at an impressive rate in both size and complexity, to such a degree that the design and development of high-performance and scalable data processing systems and networks have become an ever-challenging issue. As a result, the use of performance modeling and measurement techniques as a critical step in design and development has become a common practice. Research and development on the methodology and tools of performance modeling and performance engineering have gained further importance in order to improve the performance and scalability of these systems. Since the seminal work of A. K. Erlang almost a century ago on the modeling of telephone traffic, performance modeling and measurement have grown into a discipline and have been evolving both in their methodologies and in the areas in which they are applied. It is noteworthy that various mathematical techniques were brought into this field, including in particular probability theory, stochastic processes, statistics, complex analysis, stochastic calculus, stochastic comparison, optimization, control theory, machine learning and information theory. The application areas extended from telephone networks to Internet and Web applications, from computer systems to computer software, from manufacturing systems to supply chains, and from call centers to workforce management.
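Erlang's classic result is a concrete example of the modeling tradition described above: the Erlang B formula gives the blocking probability of an M/M/m/m loss system (m trunks, Poisson arrivals, no queueing). A minimal sketch, using the standard numerically stable recurrence rather than the factorial form:

```python
def erlang_b(traffic, servers):
    """Blocking probability for an M/M/m/m loss system (Erlang B).

    `traffic` is the offered load in Erlangs; the recurrence
    B_m = A*B_{m-1} / (m + A*B_{m-1}) starts from B_0 = 1.
    """
    b = 1.0
    for m in range(1, servers + 1):
        b = traffic * b / (m + traffic * b)
    return b

# e.g. 10 Erlangs of offered traffic on 15 trunks
print(round(erlang_b(10.0, 15), 4))
```

The recurrence avoids the overflow that computing A^m/m! directly would cause for large trunk groups, which is why it is the form used in teletraffic practice.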
Security is the science and technology of secure communications and resource protection from security violations such as unauthorized access and modification. Putting proper security in place gives us many advantages. It lets us exchange confidential information and keep it confidential. We can be sure that a piece of information received has not been changed. Nobody can deny sending or receiving a piece of information. We can control which pieces of information can be accessed, and by whom. We can know when a piece of information was accessed, and by whom. Networks and databases are guarded against unauthorized access. We have seen the rapid development of the Internet and also increasing security requirements in information networks, databases, systems, and other information resources. This comprehensive book responds to increasing security needs in the marketplace, and covers networking security and standards. There are three types of readers who are interested in security: non-technical readers, general technical readers who do not implement security, and technical readers who actually implement security. This book serves all three by providing a comprehensive explanation of the fundamental issues of networking security, the concepts and principles of security standards, and a description of some emerging security technologies. The approach is to answer the following questions: 1. What are common security problems and how can we address them? 2. What are the algorithms, standards, and technologies that can solve common security problems? 3.
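One of the properties listed above, being sure a received message has not been changed, is commonly achieved with a message authentication code. As a minimal sketch (the shared key and messages here are hypothetical, and real deployments also need key management and replay protection):

```python
import hmac
import hashlib

SECRET = b"shared-secret-key"  # hypothetical key agreed out of band

def sign(message: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag so the receiver can detect tampering."""
    return hmac.new(SECRET, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    """Constant-time comparison guards against timing attacks."""
    return hmac.compare_digest(sign(message), tag)

msg = b"transfer 100 to alice"
tag = sign(msg)
assert verify(msg, tag)                        # untouched message verifies
assert not verify(b"transfer 900 to alice", tag)  # modification is detected
```

Any party without the shared key can neither forge a valid tag nor alter the message undetected, which is exactly the integrity guarantee described in the text.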
This year, the IFIP Working Conference on Distributed and Parallel Embedded Systems (DIPES 2008) is held as part of the IFIP World Computer Congress, held in Milan on September 7-10, 2008. The embedded systems world has a great deal of experience with parallel and distributed computing. Many embedded computing systems require the high performance that can be delivered by parallel computing. Parallel and distributed computing are often the only ways to deliver adequate real-time performance at low power levels. This year's conference attracted 30 submissions, of which 21 were accepted. Prof. Jörg Henkel of the University of Karlsruhe graciously contributed a keynote address on embedded computing and reliability. We would like to thank all of the program committee members for their diligence. Wayne Wolf, Bernd Kleinjohann, and Lisa Kleinjohann. Acknowledgements: We would like to thank all people involved in the organization of the IFIP World Computer Congress 2008, especially the IPC Co-Chairs Judith Bishop and Ivo De Lotto, the Organization Chair Giulio Occhini, as well as the Publications Chair John Impagliazzo. Further thanks go to the authors for their valuable contributions to DIPES 2008. Last but not least, we would like to acknowledge the considerable amount of work and enthusiasm spent by our colleague Claudius Stern in preparing the proceedings of DIPES 2008. He made it possible to produce them in their current professional and homogeneous style.
User authentication is the process of verifying whether the identity of a user is genuine prior to granting him or her access to resources or services in a secured environment. Traditionally, user authentication is performed statically at the point of entry of the system; however, continuous authentication (CA) seeks to address the shortcomings of this method by providing increased session security and combating insider threat. Continuous Authentication Using Biometrics: Data, Models, and Metrics presents chapters on continuous authentication using biometrics that have been contributed by the leading experts in this recent, fast growing research area. These chapters collectively provide a thorough and concise introduction to the field of biometric-based continuous authentication. The book covers the conceptual framework underlying continuous authentication and presents detailed processing models for various types of practical continuous authentication applications.
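The core idea of continuous authentication, re-scoring the user throughout the session rather than only at login, can be sketched with a running trust level updated from each biometric match score. This is an illustrative toy model of my own, not a processing model from the book; the threshold and smoothing factor are arbitrary assumptions:

```python
def trust_update(trust, match_score, alpha=0.3):
    """Exponentially weighted update of the session trust level
    from the latest biometric match score (both in [0, 1])."""
    return (1 - alpha) * trust + alpha * match_score

def session(scores, threshold=0.5, initial_trust=1.0):
    """Re-evaluate trust after every observation; lock when it drops below threshold."""
    trust = initial_trust
    for s in scores:
        trust = trust_update(trust, s)
        if trust < threshold:
            return "locked"
    return "active"

print(session([0.9, 0.8, 0.9]))        # consistent genuine scores keep the session active
print(session([0.9, 0.1, 0.1, 0.1]))   # a run of poor matches locks the session
```

The smoothing makes the system tolerant of a single noisy sample while still reacting quickly to a sustained mismatch, which is the basic trade-off continuous authentication schemes must tune.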
Metropolitan Area WDM Networks: An AWG Based Approach provides a comprehensive and technically detailed overview of the latest metropolitan area WDM network experimental systems, architectures, and access protocols. Its main focus is on the novel star WDM networks based on a wavelength-selective Arrayed-Waveguide Grating (AWG). Network researchers, engineers, professionals, and graduate students will benefit from the thorough overview and gain an in-depth understanding of current and next-generation metro WDM networks. The AWG-based metro star WDM network is discussed at length and extensively investigated by means of stochastic analyses and simulations. The book provides:
The interaction paradigm is a new conceptualization of computational phenomena that emphasizes interaction over algorithms, reflecting the shift in technology from mainframe number-crunching to distributed intelligent networks with graphical user interfaces. The book is arranged in four sections: "Introduction," comprising three chapters that explore and summarize the fundamentals of interactive computation; "Theory," with six chapters, each discussing a specific aspect of interaction; "Applications," five chapters showing how this principle is applied in subdisciplines of computer science; and "New Directions," presenting four multidisciplinary applications. The book challenges traditional Turing machine-based answers to fundamental questions of problem solving and the scope of computation.
The ubiquitous nature of the Internet is enabling a new generation of applications to support collaborative work among geographically distant users. Security in such an environment is of utmost importance to safeguard the privacy of the communication and to ensure the integrity of the applications. 'Secure group communications' (SGC) refers to a scenario in which a group of participants can receive and send messages to group members in such a way that outsiders are unable to glean any information even when they are able to intercept the messages. SGC is becoming extremely important for researchers and practitioners because many applications that require SGC are now widely used, such as teleconferencing, tele-medicine, real-time information services, distributed interactive simulations, collaborative work, grid computing, and the deployment of VPNs (Virtual Private Networks). Even though considerable research accomplishments have been achieved in SGC, few books exist on this very important topic. The purpose of this book is to provide a comprehensive survey of principles and state-of-the-art techniques for secure group communications over data networks. The book is targeted towards practitioners, researchers and students in the fields of networking, security, and software applications development. The book consists of 7 chapters, which are listed and described as follows.
This book is the combined proceedings of the latest IFIP Formal Description Techniques (FDTs) and Protocol Specification, Testing and Verification (PSTV) series. It addresses FDTs applicable to communication protocols and distributed systems, with special emphasis on standardised FDTs. It features state-of-the-art in theory, application, tools and industrialisation of formal description.
This work is on biometric data indexing for large-scale identification systems, with a focus on different biometric data indexing methods. It provides state-of-the-art coverage of different biometric traits, together with the pros and cons of each. A discussion of different multimodal fusion strategies is also included.
Cybersecurity for medical devices is no longer optional. We must not allow sensationalism or headlines to drive the discussion... Nevertheless, we must proceed with urgency. In the end, this is about preventing patient harm and preserving patient trust. A comprehensive guide to medical device secure lifecycle management, this is a book for engineers, managers, and regulatory specialists. Readers gain insight into the security aspects of every phase of the product lifecycle, including concept, design, implementation, supply chain, manufacturing, postmarket surveillance, maintenance, updates, and end of life. Learn how to mitigate or completely avoid common cybersecurity vulnerabilities introduced during development and production. Grow your awareness of cybersecurity development topics ranging from high-level concepts to practical solutions and tools. Get insight into emerging regulatory and customer expectations. Uncover how to minimize schedule impacts and accelerate time-to-market while still accomplishing the main goal: reducing patient and business exposure to cybersecurity risks. Medical Device Cybersecurity for Engineers and Manufacturers is designed to help all stakeholders lead the charge to a better medical device security posture and improve the resilience of our medical device ecosystem.
With the rapid increase in mobile users of laptop computers and cellular phones, support for Internet services such as e-mail and World Wide Web (WWW) access in a mobile environment is an indispensable requirement. Wireless networks must be able to carry real-time bursty traffic (such as voice or video) and data traffic in a multimedia environment with a high quality of service. To satisfy the huge demand for wireless multimedia services, efficient channel access methods must be devised. For the design and tuning of these channel access methods, system performance must be mathematically analysed. To do so, very accurate models that faithfully reproduce the stochastic behaviour of multimedia wireless communication and computer networks must be constructed.
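The simplest stochastic model of the kind described above is the M/M/1 queue (Poisson arrivals at rate λ, exponential service at rate μ), for which theory predicts a mean sojourn time of 1/(μ − λ). As an illustrative sketch, not a model from the book, a short discrete-event simulation can be checked against that closed form:

```python
import random

def mm1_mean_sojourn(lam, mu, n_customers=200_000, seed=1):
    """Simulate an M/M/1 queue (Poisson arrivals, exponential service)
    and return the average time a customer spends in the system."""
    rng = random.Random(seed)
    t_arrive = 0.0   # arrival time of the current customer
    t_free = 0.0     # time the server next becomes free
    total = 0.0
    for _ in range(n_customers):
        t_arrive += rng.expovariate(lam)          # next Poisson arrival
        start = max(t_arrive, t_free)             # wait if the server is busy
        t_free = start + rng.expovariate(mu)      # exponential service time
        total += t_free - t_arrive                # sojourn = wait + service
    return total / n_customers

lam, mu = 0.8, 1.0
print(mm1_mean_sojourn(lam, mu))   # theory predicts 1/(mu - lam) = 5.0
```

Agreement between the simulated average and the analytic value is the basic sanity check behind the more elaborate multimedia traffic models the text calls for.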
Three important technology issues face professionals in today's business, education, and government world. In Privacy, Identity, and Cloud Computing, author and computer expert Dr. Harry Katzan Jr. addresses the subjects of privacy and identity as they relate to the new discipline of cloud computing, a model for providing on-demand access to computing services via the Internet. A compendium of eight far-reaching papers, Privacy, Identity, and Cloud Computing thoroughly dissects and discusses the following: the privacy of cloud computing; identity as a service; identity analytics and belief structures; compatibility relations in identity analysis; a conspectus of cloud computing; cloud computing economics (the democratization and monetization of services); an ontological view of cloud computing; and privacy as a service. Katzan not only provides a wealth of information, but also gives exposure to these topics facing today's computer users. Ultimately, these are important facets of modern computing, and all their implications must be considered thoroughly in anticipation of future developments.
Communication protocols are rules whereby meaningful communication can be exchanged between different communicating entities. In general, they are complex and difficult to design and implement. Specifications of communication protocols written in a natural language (e.g. English) can be unclear or ambiguous, and may be subject to different interpretations. As a result, independent implementations of the same protocol may be incompatible. In addition, the complexity of protocols makes them very hard to analyze informally. There is, therefore, a need for precise and unambiguous specification using formal languages. Many protocol implementations used in the field have suffered from failures, such as deadlocks. When the conditions in which a protocol works correctly have changed, there has been no general method available for determining how it will work under the new conditions. It is necessary for protocol designers to have techniques and tools to detect errors in the early phases of design, because the later in the process that a fault is discovered, the greater the cost of rectifying it. Protocol verification is the process of checking whether the interactions of protocol entities, according to the protocol specification, do indeed satisfy certain properties or conditions, which may be either general (e.g., absence of deadlock) or specific to the particular protocol system directly derived from the specification. In the 1980s, an ISO (International Organization for Standardization) working group began a programme of work to develop formal languages suitable for Open Systems Interconnection (OSI). This group called such languages Formal Description Techniques (FDTs). Among ISO's objectives in developing FDTs were: enabling unambiguous, clear and precise descriptions of OSI protocol standards to be written, and allowing such specifications to be verified for correctness.
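The deadlock checks mentioned above are typically performed by exhaustively exploring the global state space of the communicating entities. As a minimal sketch (a deliberately tiny toy protocol of my own, not an example from the book), here two entities each wait for the other to speak first, and the exploration flags the resulting deadlock:

```python
from collections import deque

def explore(init, transitions):
    """Enumerate all reachable global states of a protocol and report
    deadlocks: non-final states with no enabled transition."""
    seen, queue, deadlocks = {init}, deque([init]), []
    while queue:
        state = queue.popleft()
        successors = transitions(state)
        if not successors and state != ("done", "done"):
            deadlocks.append(state)
        for nxt in successors:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return deadlocks

def transitions(state):
    """Each entity can only respond after the other has sent."""
    a, b = state
    succ = []
    if a == "wait" and b == "sent":   # A receives B's message and both finish
        succ.append(("done", "done"))
    if b == "wait" and a == "sent":   # B receives A's message and both finish
        succ.append(("done", "done"))
    return succ

print(explore(("wait", "wait"), transitions))  # both waiting: classic deadlock
print(explore(("sent", "wait"), transitions))  # A spoke first: no deadlock
```

Real FDT-based verifiers work on the same reachability principle, with the state including channel contents and with reductions to tame the state-space explosion.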
There are two FDTs standardized by ISO: LOTOS and Estelle. Communication Protocol Specification and Verification is written to address the two issues discussed above: the need to specify a protocol using an FDT, and the need to verify its correctness in order to uncover specification errors in the early stages of protocol development. The readership primarily consists of advanced undergraduate students, postgraduate students, communication software developers, telecommunication engineers, EDP managers, researchers and software engineers. It is intended as an advanced undergraduate or postgraduate textbook, and a reference for communication protocol professionals.
The advent of the digital economy has the potential to dramatically change the conventional interrelationships among individuals, enterprises and society. There can be little doubt that to achieve vigorous socioeconomic developments in the 21st century, people will have to aggressively use information technology to boost innovation and to organically link the results of that innovation to solutions to global environmental issues and social challenges such as the opportunity divide. We are responsible for taking advantage of the opportunities opened up by the digital economy and for turning those opportunities into things that reflect our values and goals. The book examines the overall impact of the digital economy and the development of a practical institutional design.
Today's advancements in technology have brought about a new era of speed and simplicity for consumers and businesses. Due to these new benefits, the possibilities of universal connectivity, storage and computation are made tangible, thus leading the way to new Internet-of-Things solutions. Resource Management and Efficiency in Cloud Computing Environments is an authoritative reference source for the latest scholarly research on the emerging trends of cloud computing and reveals the benefits cloud paths provide to consumers. Featuring coverage across a range of relevant perspectives and topics, such as big data, cloud security, and utility computing, this publication is an essential source for researchers, students and professionals seeking current research on the organization and productivity of cloud computing environments. Topics covered: big data; cloud application services (SaaS); cloud security; hybrid cloud; Internet of Things (IoT); private cloud; public cloud; service-oriented architecture (SOA); utility computing; virtualization technology.
The area of intelligent and adaptive user interfaces has been of interest to the research community for a long time. Much effort has been spent in trying to find a stable theoretical base for adaptivity in human-computer interaction and to build prototypical systems showing features of adaptivity in real-life interfaces. To date, research in this field has not led to a coherent view of the problems, let alone solutions. A workshop was organized, which brought together a number of well-known researchers in the area of adaptive user interfaces with a view to
Video monitoring has become a vital aspect within the global society as it helps prevent crime, promote safety, and track daily activities such as traffic. As technology in the area continues to improve, it is necessary to evaluate how video is being processed to improve the quality of images. Applied Video Processing in Surveillance and Monitoring Systems investigates emergent techniques in video and image processing by evaluating such topics as segmentation, noise elimination, encryption, and classification. Featuring real-time applications, empirical research, and vital frameworks within the field, this publication is a critical reference source for researchers, professionals, engineers, academicians, advanced-level students, and technology developers.
Wireless Distributed Computing and Cognitive Sensing defines high-dimensional data processing in the context of wireless distributed computing and cognitive sensing. The book presents the challenges unique to this area, such as the synchronization problems caused by the high mobility of the nodes. The author discusses the integration of software-defined radio implementation and testbed development, and bridges new research results with contextual reviews. The author also provides an examination of large cognitive radio networks, hardware testbeds, distributed sensing, and distributed computing.
This book brings together 24 groups of experts and active researchers from around the world in image processing and analysis, video processing and analysis, and communications-related processing, to present their newest research results, exchange their latest experiences and insights, and explore future directions in these important and rapidly evolving areas. It aims at increasing the synergy between academic and industry professionals working in the field. It focuses on state-of-the-art research in various essential areas related to emerging technologies, standards and applications for the analysis, processing, computing, and communication of multimedia information. The target audience of this book is researchers and engineers as well as graduate students working in various disciplines linked to multimedia analysis, processing and communications, e.g., computer vision, pattern recognition, information technology, image processing, and artificial intelligence. The book is also meant for a broader audience, including practicing professionals working in image/video applications such as image processing, video surveillance, and multimedia indexing and retrieval. We hope that the researchers, engineers, students and other professionals who read this book will find it informative, useful and inspirational toward their own work in one way or another.
This book answers a question which came about while the author was working on his diploma thesis [1]: would it be better to ask for the available bandwidth instead of probing the network (like TCP does)? The diploma thesis was concerned with long-distance musical interaction ("NetMusic"). This is a very peculiar application: only a small amount of bandwidth may be necessary, but timely delivery and low loss are very important. Back then, these requirements led to a thorough investigation of existing telecommunication network mechanisms, but a satisfactory answer to the question could not be found. Simply put, the answer is "yes": this work describes a mechanism which indeed enables an application to "ask for the available bandwidth". This obviously no longer concerns only online musical collaboration. Among others, the mechanism yields the following advantages over existing alternatives: good throughput while maintaining close-to-zero loss and a small bottleneck queue length; usefulness for streaming media applications due to a very smooth rate; feasibility for satellite and wireless links; and high scalability. Additionally, a reusable framework was developed for future applications that need to "ask the network" for certain performance data.
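The general idea of "asking the network" instead of probing it is explicit-rate feedback: each hop on the path reports its spare capacity, and the sender transmits at the path minimum. This sketch illustrates only that general principle, with made-up numbers; it is not the author's actual mechanism:

```python
def ask_path_rate(link_capacities, link_loads):
    """Each hop stamps the packet with its spare capacity; the sender
    transmits at the minimum over the path (explicit-rate feedback)."""
    spare = [c - l for c, l in zip(link_capacities, link_loads)]
    return max(0.0, min(spare))

# A 3-hop path in Mbit/s: the middle link is the bottleneck.
rate = ask_path_rate([100.0, 10.0, 100.0], [20.0, 4.0, 50.0])
print(rate)  # -> 6.0
```

Because the sender learns the bottleneck's spare capacity directly, it can hold the rate just below it, which is what yields the near-zero loss and smooth rate claimed above, in contrast to TCP's probe-and-back-off sawtooth.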
The ubiquitous nature of the Internet of Things allows for enhanced connectivity between people in modern society. When applied to various industries, these current networking capabilities create opportunities for new applications. Internet of Things and Advanced Application in Healthcare is a critical reference source for emerging research on the implementation of the latest networking and technological trends within the healthcare industry. Featuring in-depth coverage across the broad scope of the Internet of Things in specialized settings, such as context-aware computing, reliability, and healthcare support systems, this publication is an ideal resource for professionals, researchers, upper-level students, practitioners, and technology developers seeking innovative material on the Internet of Things and its distinct applications. Topics covered: assistive technologies; context-aware computing systems; health risk management; healthcare support systems; reliability concerns; smart healthcare; wearable sensors.
Written by one of the founding fathers of Quantum Information, this book gives an accessible (albeit mathematically rigorous), self-contained introduction to quantum information theory. The central role is played by the concept of the quantum channel and its entropic and information characteristics. In this revised edition, the main results have been updated to reflect the most recent developments in this very active field of research.
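The classical ancestor of the channel capacities developed in the book is Shannon's result for the binary symmetric channel: with flip probability p, the capacity is C = 1 − H(p) bits per use, where H is the binary entropy. A minimal sketch of that classical baseline (not material from the book itself):

```python
import math

def h2(p):
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with flip probability p:
    C = 1 - H(p) bits per channel use."""
    return 1 - h2(p)

print(bsc_capacity(0.0), bsc_capacity(0.5))  # -> 1.0 0.0
```

A noiseless channel (p = 0) carries one full bit per use, while p = 0.5 destroys all information; the quantum theory generalizes such entropic capacity formulas to quantum channels.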