Welcome to Loot.co.za!
Cryptology problems, such as designing good cryptographic systems and analyzing them, have recently challenged researchers. Many algorithms based on computational intelligence techniques, such as genetic algorithms and genetic programming, have been proposed to solve these problems. Implementing Computational Intelligence Techniques for Security Systems Design is an essential research book that explores the application of computational intelligence and other advanced techniques to information security, contributing to a better understanding of the factors that influence successful security systems design. Featuring a range of topics such as encryption, self-healing systems, and cyber fraud, this book is ideal for security analysts, IT specialists, computer engineers, software developers, technologists, academicians, researchers, practitioners, and students.
The primary objective of this book is to teach the architectures, design principles, and troubleshooting techniques of a LAN. This will be imparted through the presentation of a broad scope of data and computer communication standards, real-world inter-networking techniques, architectures, hardware, software, protocols, technologies and services as they relate to the design, implementation and troubleshooting of a LAN. The logical and physical design of hardware and software is not the only process involved in the design and implementation of a LAN. The latter also encompasses many other aspects including making the business case, compiling the requirements, choosing the technology, planning for capacity, selecting the vendor, and weighing all the issues before the actual design begins.
Software design is becoming increasingly complex and difficult as we move to applications that support people interacting with information and with each other over networks. Computer supported cooperative work applications are a typical example of this. The problems to be solved are no longer just technical; they are also social: how do we build systems that meet the real needs of the people who are asked to use them and that fit into their contexts of use? We can characterise these as wicked problems, where our traditional software engineering techniques for understanding requirements and driving these through into design are no longer adequate. This book presents the Locales Framework - and its five aspects of locale foundations, civic structures, individual views, interaction trajectory and mutuality - as a way of dealing with the intertwined problem-solution space of wicked problems. A locale is based on a metaphor of place as the lived relationship between people and the spaces and resources they use in their interactions. The Locales Framework provides a coherent mediating framework for ethnographers, designers, and software engineers to facilitate both understanding the requirements of complex social situations and designing solutions to support these situations in all their complexity.
Brings you up to speed on mobile data system design, current and emerging wireless network and systems standards, and network architectures. Describes mobile data applications and wireless LANs, and analyzes and evaluates current technologies.
Microsoft Exchange 2000 Infrastructure Design explains, from a system designer's and administrator's perspective, Microsoft's Active Directory and its interaction with Exchange 2000, details issues concerned with migration to Exchange 2000, and outlines the specific technology and design issues relating to connectivity with Exchange 2000. Readers will learn to use these technologies to coexist seamlessly with their current environment, migrate to a native Exchange 2000 environment, and connect to the Internet as well as to other messaging systems. The book's blend of expert instruction and best practices will help any organization create optimal system designs and configurations to support different technical and business scenarios.
This book gives an overview of constraint satisfaction problems (CSPs), adapts related search algorithms and consistency algorithms for application to multi-agent systems, and consolidates recent research devoted to cooperation in such systems. The techniques introduced are applied to various problems in multi-agent systems. Among the new approaches is a hybrid algorithm for weak-commitment search combining backtracking and iterative improvement; also, an extension of the basic CSP formalization called partial CSP is introduced in order to handle over-constrained CSPs. The book is written for advanced students and professionals interested in multi-agent systems or, more generally, in distributed artificial intelligence and constraint satisfaction. Researchers active in the area will appreciate this book as a valuable source of reference.
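To make the CSP setting concrete, here is a minimal sketch of plain chronological backtracking on a toy graph-colouring CSP. This is an illustration of the basic formalism only, not the book's weak-commitment or distributed algorithms; the variable names and the triangle example are invented for the demonstration.

```python
# Minimal chronological backtracking for a CSP (illustrative sketch).
# A CSP is given by variables, per-variable domains, and constraints
# expressed as predicates over a (partial) assignment.

def backtrack(assignment, variables, domains, constraints):
    """Assign values one variable at a time, undoing on conflict."""
    if len(assignment) == len(variables):
        return assignment  # every variable assigned consistently
    var = next(v for v in variables if v not in assignment)
    for value in domains[var]:
        candidate = assignment | {var: value}
        if all(ok(candidate) for ok in constraints):
            result = backtrack(candidate, variables, domains, constraints)
            if result is not None:
                return result
    return None  # no consistent value for var: backtrack

# Example: colour a triangle graph with 3 colours (adjacent nodes differ).
variables = ["A", "B", "C"]
domains = {v: ["red", "green", "blue"] for v in variables}
edges = [("A", "B"), ("B", "C"), ("A", "C")]
constraints = [
    # A constraint is satisfied vacuously until both endpoints are assigned.
    lambda a, x=x, y=y: x not in a or y not in a or a[x] != a[y]
    for x, y in edges
]

solution = backtrack({}, variables, domains, constraints)
```

Weak-commitment search, as discussed in the book, differs from this sketch by abandoning a partial solution wholesale when it cannot be extended, rather than undoing one assignment at a time.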
This book describes the struggle to introduce a mechanism that enables next-generation information systems to maintain themselves. Our generation observed the birth and growth of information systems, and of the Internet in particular. Surprisingly, information systems are quite different from conventional (energy- and material-intensive) artificial systems, and rather resemble biological (information-intensive) systems. Many artificial systems are designed based on (Newtonian) physics, assuming that every element obeys simple and static rules; the experience of the Internet, however, suggests a different way of designing, in which growth cannot be controlled but is self-organized by autonomous and selfish agents. This book suggests using game theory, and mechanism design in particular, for designing next-generation information systems that will self-organize through the collective acts of autonomous components. The challenge of mapping a probability to time appears repeatedly in many forms throughout the book. The book contains interdisciplinary research encompassing game theory, complex systems, reliability theory and particle physics, all devoted to its central theme: what happens if systems repair themselves?
Despite the complexity of the subject, this wealth of information is presented succinctly and in such a way, using tables, diagrams and brief explanatory text, as to allow the user to locate information quickly and easily. Thus the book should be invaluable to those involved with the installation, commissioning and maintenance of data communications equipment, as well as the end user.
Lego Mindstorms robots are sweeping the world, and fans need to learn how to program them.
Safety is a paradoxical system property. It remains immaterial, intangible and invisible until a failure, an accident or a catastrophe occurs and, too late, reveals its absence. And yet, a system cannot be relied upon unless its safety can be explained, demonstrated and certified. The practical and difficult questions which motivate this study concern the evidence and the arguments needed to justify the safety of a computer-based system, or more generally its dependability. Dependability is a broad concept integrating properties such as safety, reliability, availability, maintainability and other related characteristics of the behaviour of a system in operation. How can we give users the assurance that the system enjoys the required dependability? How should evidence be presented to certification bodies or regulatory authorities? What best practices should be applied? How should we decide whether there is enough evidence to justify the release of the system? To help answer these daunting questions, a method and a framework are proposed for the justification of the dependability of a computer-based system. The approach specifically aims at dealing with the difficulties raised by the validation of software. Hence, it should be of wide applicability, despite being mainly based on the experience of assessing Nuclear Power Plant instrumentation and control systems important to safety. To be viable, a method must rest on a sound theoretical background.
The telecommunications industry is experiencing a worldwide explosion of growth as few other industries ever have. However, as recently as a decade ago, the bulk of telecommunications services were delivered by the traditional telephone network, for which design and analysis principles had been under steady development for over three-quarters of a century. This environment was characterized by moderate and steady growth, with an accompanying slower development of new network equipment and standardization processes. In such a near-static environment, attention was given to optimization techniques to squeeze out better profits from existing and limited future investments. To this end, forecasts of network services were developed on a regular planning cycle and networks were optimized accordingly, layer by layer, for cost-effective placement of capacity and efficient utilization. In particular, optimization was based on a fairly stable set of assumptions about the network architecture, equipment models, and forecast uncertainty. This special edition is devoted to heuristic approaches for telecommunications network management, planning, and expansion. We hope that this collection brings to the attention of researchers and practitioners an array of techniques and case studies that meet the stringent 'time to market' requirements of this industry and which deserve exposure to a wider audience. Telecommunications will face a tremendous challenge in the coming years to be able to design, build, and manage networks in such a rapidly evolving industry. Development and application of heuristic methods will be fundamental in our ability to meet this challenge.
Over one billion people access the Internet worldwide, and new problems of language, security, and culture accompany this new excess in access. Computer-Mediated Communication across Cultures: International Interactions in Online Environments provides readers with the foundational knowledge needed to communicate safely and effectively with individuals from other countries and cultures via online media. Through a closer examination of the expanded global access to the Web, this book discusses the use and design of cross-cultural digital media and the future of the field for executives, marketers, researchers, educators, and the average user.
The Palm theory and the Loynes theory of stationary systems are the two pillars of the modern approach to queuing. This book, presenting the mathematical foundations of the theory of stationary queuing systems, contains a thorough treatment of both. This approach helps to clarify the picture, in that it separates the task of obtaining the key system formulas from that of proving convergence to a stationary state and computing its law. The theory is constantly illustrated by classical results and models: the Pollaczek-Khinchine and Takács formulas, Jackson and Gordon-Newell networks, multiserver queues, blocking queues, loss systems, etc., but it also contains recent and significant examples where the tools developed turn out to be indispensable. Several other mathematical tools which are useful within this approach are also presented, such as the martingale calculus for point processes, or stochastic ordering for stationary recurrences. This thoroughly revised second edition contains substantial additions - in particular, exercises and their solutions - rendering this now classic reference suitable for use as a textbook.
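As a reminder of the kind of classical result involved, the Pollaczek-Khinchine formula in its mean-value form for the M/G/1 queue (Poisson arrivals at rate $\lambda$, i.i.d. service times $S$, load $\rho = \lambda E[S] < 1$) gives the mean waiting time:

```latex
% Pollaczek-Khinchine mean-value formula for the M/G/1 queue:
% \lambda = arrival rate, S = generic service time, \rho = \lambda E[S] < 1.
E[W] \;=\; \frac{\lambda\, E[S^{2}]}{2\,(1-\rho)}
```

Note that the second moment $E[S^{2}]$ of the service time appears explicitly, so service-time variability, not just the mean load, drives the waiting time.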
Data networking now plays a major role in everyday life and new applications continue to appear at a blinding pace. Yet we still do not have a sound foundation for designing, evaluating and managing these networks. This book covers topics at the intersection of algorithms and networking. It builds a complete picture of the current state of research on Next Generation Networks and the challenges for the years ahead. Particular focus is given to evolving research initiatives, the architectures they propose, and their implications for networking. Topics: network design and provisioning, hardware issues, layer-3 algorithms and MPLS, BGP and inter-AS routing, packet processing for routing, security and network management, load balancing, oblivious routing and stochastic algorithms, network coding for multicast, overlay routing for P2P networking and content delivery. This timely volume will be of interest to a broad readership, from graduate students to researchers looking to survey recent research and its open questions.
This book gathers visionary ideas from leading academics and scientists to predict the future of wireless communication and enabling technologies in 2050 and beyond. The content combines a wealth of illustrations, tables, business models, and novel approaches to the evolution of wireless communication. The book also provides glimpses into the future of emerging technologies, end-to-end systems, and entrepreneurial and business models, broadening readers' understanding of potential future advances in the field and their influence on society at large.
Computer Networks & Communications (NetCom) presents the proceedings of the Fourth International Conference on Networks & Communications. This book covers the theory, methodology and applications of computer networks, network protocols and wireless networks, data communication technologies, and network security. The proceedings feature peer-reviewed papers that illustrate research results, projects, surveys and industrial experiences describing significant advances in the diverse areas of computer networks & communications.
Blockchain technologies, as an emerging distributed architecture and computing paradigm, have accelerated the development and application of Cloud/GPU/Edge Computing, Artificial Intelligence, cyber-physical systems, social networking, crowdsourcing and crowdsensing, 5G, trust management, and finance. The popularity and rapid development of Blockchain bring many technical and regulatory challenges for research and academic communities. This book features contributions from experts on topics related to performance, benchmarking, durability, robustness, as well as data gathering and management, algorithms, analytics techniques for transaction processing, and implementation of applications.
In this book the author examines 60 GHz and conventional UWB. The book introduces the fundamentals, architectures, and applications of unified ultra wideband devices. The material includes both theory and practice and introduces ultra wideband communication systems and their applications in a systematic manner. The material is written to enable readers to design, analyze, and evaluate UWB communication systems.
Mission-Critical Microsoft Exchange 2000 is the definitive book on how to design and maintain extremely reliable and adaptive Exchange Server messaging systems that rarely crash and that preserve valuable data and services in spite of technical disruptions. E-mail systems are now a primary means of communication for organizations, which can afford e-mail downtime no more than they can afford to be without phones. Further, messaging systems increasingly support vital applications in addition to e-mail, such as workflow and knowledge management, making the data they store both voluminous and incredibly valuable.
Software Defined Radio makes wireless communications easier, more efficient, and more reliable. This book bridges the gap between academic research and practical implementation. When beginning a project, practicing engineers, technical managers, and graduate students can save countless hours by considering the concepts presented in these pages. The author covers the myriad options and trade-offs available when selecting an appropriate hardware architecture. As demonstrated here, the choice between hardware- and software-centric architecture can mean the difference between meeting an aggressive schedule and bogging down in endless design iterations. Because of the author's experience overseeing dozens of failed and successful developments, he is able to present many real-life examples. Some of the key concepts covered are: choosing the right architecture for the target market (laboratory, military, or commercial); hardware platforms, including FPGAs, GPPs, and specialized and hybrid devices; standardization efforts to ensure interoperability and portability; and state-of-the-art components for radio frequency, mixed-signal, and baseband processing. The text requires only minimal knowledge of wireless communications; whenever possible, qualitative arguments are used instead of equations. An appendix provides a quick overview of wireless communications and introduces most of the concepts readers will need to take advantage of the material. An essential introduction to SDR, this book is sure to be an invaluable addition to any technical bookshelf.
This book describes state-of-the-art approaches to Fog Computing, including the background of innovations achieved in recent years. Coverage includes various aspects of fog computing architectures for Internet of Things, driving reasons, variations and case studies. The authors discuss in detail key topics, such as meeting low latency and real-time requirements of applications, interoperability, federation and heterogeneous computing, energy efficiency and mobility, fog and cloud interplay, geo-distribution and location awareness, and case studies in healthcare and smart space applications.
It is becoming increasingly evident that the weakest links in the information-security chain are the people. Due to an increase in information security threats, it is imperative for organizations and professionals to learn more about the human nature and social interactions behind those creating the problems. Social and Human Elements of Information Security: Emerging Trends and Countermeasures provides insightful, high-quality research into the social and human aspects of information security. A comprehensive source of the latest trends, issues, and findings in the field, this book fills a gap in the existing literature by bringing together the most recent work from researchers in the fast-evolving field of information security.
Research on secure key establishment has become very active within the last few years. Secure Key Establishment discusses the problems encountered in this field and introduces several improved protocols with new proofs of security. The book identifies several variants of the key sharing requirement and covers several variants of the widely accepted Bellare-Rogaway (1993) model. A comparative study of the relative strengths of security notions between these variants of the Bellare-Rogaway model and the Canetti-Krawczyk model is included, and an integrative framework is proposed that allows protocols to be analyzed in a modified version of the Bellare-Rogaway model using an automated model checker. Secure Key Establishment is designed for advanced-level students in computer science and mathematics, as a secondary text or reference book. It is also suitable for practitioners and researchers working for defense agencies or security companies.
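For readers new to the topic, the basic shape of a key establishment protocol can be sketched with an unauthenticated Diffie-Hellman exchange. This is a toy illustration only, not one of the book's protocols: the modulus below is demo-sized and far too small for real security, and the lack of authentication is precisely the kind of gap that models such as Bellare-Rogaway are designed to expose.

```python
# Toy (insecure, unauthenticated) Diffie-Hellman key establishment.
import secrets

p = 4294967291  # largest prime below 2**32 -- demo-sized, NOT secure
g = 2           # public base

# Each party picks a private exponent and publishes g^x mod p.
alice_priv = secrets.randbelow(p - 2) + 1
bob_priv = secrets.randbelow(p - 2) + 1
alice_pub = pow(g, alice_priv, p)
bob_pub = pow(g, bob_priv, p)

# Each side raises the other's public value to its own private exponent;
# both arrive at g^(alice_priv * bob_priv) mod p.
alice_key = pow(bob_pub, alice_priv, p)
bob_key = pow(alice_pub, bob_priv, p)
```

Because nothing in this exchange proves who sent which public value, an active attacker can sit in the middle and establish separate keys with each side; formal models of key establishment exist to reason about exactly such attacks.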
You may like...
Practical Industrial Data Networks…
Steve Mackay, Edwin Wright, …
Paperback
R1,452
Discovery Miles 14 520
Practical Modern SCADA Protocols - DNP3…
Gordon Clarke, Deon Reynders
Paperback
R1,469
Discovery Miles 14 690