Originally published in 1995, Large Deviations for Performance Analysis consists of two synergistic parts. The first half develops the theory of large deviations from the beginning, through recent results on the theory for processes with boundaries, keeping to a very narrow path: continuous-time, discrete-state processes. By developing only what is needed for the applications, the theory is kept to a manageable level, both in terms of length and in terms of difficulty. Within its scope, the treatment is detailed, comprehensive and self-contained. As the book shows, there are sufficiently many interesting applications of jump Markov processes to warrant a special treatment. The second half is a collection of applications developed at Bell Laboratories. The applications cover large areas of the theory of communication networks: circuit-switched transmission, packet transmission, multiple access channels, and the M/M/1 queue. Aspects of parallel computation are covered as well, including the basics of job allocation, rollback-based parallel simulation, assorted priority queueing models that might be used in performance models of various computer architectures, and asymptotic coupling of processors. These applications are thoroughly analysed using the tools developed in the first half of the book.
Today, it is not uncommon for practices and hospitals to be on their second or third EHR and/or contemplating a transition from the traditional on-premises model to a cloud-based system. As a follow-up to Complete Guide and Toolkit to Successful EHR Adoption ((c) 2011 HIMSS), this book builds on the best practices of the first edition, fast-forwarding to the latest innovations that are currently leveraged and adopted by providers and hospitals. We examine the role that artificial intelligence (AI) is now playing in and around EHR technology. We also address the advances in analytics and deep learning (also known as deep structured or hierarchical learning) and explain this topic in practical ways for even the most novice reader to comprehend and apply. The challenges of EHR-to-EHR migrations and data conversions will also be covered, including the unethical practice of data blocking, used as a tactic by some vendors to hold data hostage. Further, we explore innovations related to interoperability, cloud computing, cyber security, and electronic patient/consumer engagement. Finally, this book will deal with what to do with aging technology and databases, an issue rarely considered in any of the early publications on healthcare technology. What is the proper way to retire a legacy system, and what are the legal obligations of data archiving? Though a lot has changed since the 2011 edition, many of the fundamentals remain the same and will serve as a foundation for the next generation of EHR adopters and/or those moving on to their second, third, fourth, and beyond EHRs.
This title includes a number of Open Access chapters. Model-driven engineering (MDE) is the automatic production of software from simplified models of structure and functionality. It mainly involves the automation of the routine and technologically complex programming tasks, thus allowing developers to focus on the true value-adding functionality that the system needs to deliver. This book serves as an overview of some of the core topics in MDE. The volume is broken into two sections offering a selection of papers that help the reader not only understand the MDE principles and techniques, but also learn from practical examples. Also covered are the following topics:
* MDE for software product lines
* Formal methods for model transformation correctness
* Metamodeling with Eclipse Ecore
* Metamodeling with UML profiles
* Test case generation
This easily accessible reference volume offers a comprehensive guide to this rapidly expanding field. Edited by writers with experience in both research and the practice of software engineering, Model-Driven Engineering of Information Systems: Principles, Techniques and Practice is an authoritative and easy-to-use reference, ideal for both researchers in the field and students who wish to gain an overview of this important field of study.
The Internet of Energy (IoE), with the integration of advanced information and communication technologies (ICT), has led to a transformation of traditional networks into smart systems. Internet of Energy Handbook provides updated knowledge in the field of energy management from an Internet of Things (IoT) perspective. Features:
* Explains the technological developments for energy management leading to a reduction in energy consumption, through topics like smart energy systems, smart sensors, communication techniques, and utilization
* Includes dedicated sections covering varied aspects related to renewable sources of energy, power distribution, and generation
* Incorporates energy efficiency, optimization, and sensor technologies
* Covers multidisciplinary aspects in computational intelligence and IoT
* Discusses building energy management aspects, including temperature, humidity, the number of persons involved, and light intensity
This handbook is aimed at graduate students, researchers, and professionals interested in power systems, IoT, smart grids, electrical engineering, and transmission.
According to McKinsey, a staggering 70% of digital transformations have failed. The key reason enterprises fail in their digital transformation journey is that there is no standard framework in the industry that enterprises can use to transform themselves to digital. There are several books that speak about technologies such as cloud, artificial intelligence, and data analytics in silos, but none of these provides a holistic view of how enterprises can embark on a digital transformation journey and be successful using a combination of these technologies. FORMULA 4.0 is a methodology that provides clear guidance for enterprises aspiring to transform their traditional operating model to digital. Enterprises can use this framework as a readymade guide and plan their digital transformation journey. This book is intended for all chief executives, software managers, and leaders who intend to successfully lead this digital transformation journey. An enterprise can achieve success in digital transformation only if it can create an IT platform that will enable it to adopt any new technology seamlessly into its existing IT estate; deliver new products and services to the market in shorter durations; make business decisions with IT as an enabler; and utilize automation in all its major business and IT processes. Achieving these goals is what defines a digital enterprise -- Formula 4.0 is a methodology for enterprises to achieve these goals and become digital. Essentially, no existing framework in the market provides a step-by-step guide to enterprises on how to embark on a successful digital transformation journey. This book enables such transformations. Overall, Formula 4.0 is an enterprise digital transformation framework that enables organizations to become truly digital.
Modern computing relies on future and emergent technologies which have been conceived via interaction between computer science, engineering, chemistry, physics, and biology. This highly interdisciplinary book presents advances in the fields of parallel, distributed, and emergent information processing and computation. The book represents major breakthroughs in parallel quantum protocols, elastic cloud servers, structural properties of interconnection networks, the internet of things, morphogenetic collective systems, swarm intelligence and cellular automata, unconventionality in parallel computation, algorithmic information dynamics, localized DNA computation, graph-based cryptography, slime-mold-inspired nano-electronics, and cytoskeleton computers. Features:
* Truly interdisciplinary, spanning computer science, electronics, mathematics, and biology
* Covers widely popular topics of future and emergent computing technologies: cloud computing, parallel computing, DNA computation, security and network analysis, cryptography, and theoretical computer science
* Provides unique chapters written by top experts in theoretical and applied computer science, information processing, and engineering
From Parallel to Emergent Computing provides a visionary statement on how computing will advance in the next 25 years and what new fields of science will be involved in computing engineering. This book is a valuable resource for computer scientists working today, and in years to come.
As the 2020 global lockdown became a universal strategy to control the COVID-19 pandemic, social distancing triggered a massive reliance on online and cyberspace alternatives and switched the world to the digital economy. Despite their effectiveness for remote work and online interactions, cyberspace alternatives ignited several Cybersecurity challenges. Malicious hackers capitalized on global anxiety and launched cyberattacks against unsuspecting victims. Internet fraudsters exploited human and system vulnerabilities and impacted data integrity, privacy, and digital behaviour. Cybersecurity in the COVID-19 Pandemic demystifies Cybersecurity concepts using real-world cybercrime incidents from the pandemic to illustrate how threat actors perpetrated computer fraud against valuable information assets, particularly healthcare, financial, commercial, travel, academic, and social networking data. The book simplifies the socio-technical aspects of Cybersecurity and draws valuable lessons from the impacts COVID-19 cyberattacks exerted on computer networks, online portals, and databases. The book also predicts the fusion of Cybersecurity into Artificial Intelligence and Big Data Analytics, the two emerging domains that will potentially dominate and redefine post-pandemic Cybersecurity research and innovations between 2021 and 2025. The book's primary audience is individual and corporate cyberspace consumers across all professions intending to update their Cybersecurity knowledge for detecting, preventing, responding to, and recovering from computer crimes. Cybersecurity in the COVID-19 Pandemic is ideal for information officers, data managers, business and risk administrators, technology scholars, Cybersecurity experts and researchers, and information technology practitioners. Readers will draw lessons for protecting their digital assets from email phishing fraud, social engineering scams, malware campaigns, and website hijacks.
With the advent of the IT revolution, the volume of data produced has increased exponentially and is still showing an upward trend. This data may be abundant and enormous, but it is a precious resource and should be managed properly. Cloud technology plays an important role in data management. Storing data in the cloud rather than on local storage has many benefits, but alongside these benefits there are privacy concerns in storing sensitive data on third-party servers. These concerns can be addressed by storing data in an encrypted form; however, while encryption solves the problem of privacy, it engenders other serious issues, including the infeasibility of the fundamental search operation and a reduction in flexibility when sharing data with other users. The concept of searchable encryption addresses these issues. This book provides every necessary detail required to develop a secure, searchable encryption scheme using both symmetric and asymmetric cryptographic primitives, along with the appropriate security models to ensure the minimum security requirements for real-world applications.
Comprehensive and timely, Cloud Computing: Concepts and Technologies offers a thorough and detailed description of cloud computing concepts, architectures, and technologies, along with guidance on the best ways to understand and implement them. It covers multi-core architectures, distributed and parallel computing models, virtualization, cloud developments, workloads and service-level agreements (SLAs) in the cloud, and workload management. Further, resource management issues in the cloud with regard to resource provisioning, resource allocation, resource mapping, and resource adaptation, as well as ethical, non-ethical, and security issues, are followed by a discussion of open challenges and future directions. This book gives students a comprehensive overview of the latest technologies and guidance on cloud computing, and is ideal for those studying the subject in specific modules or advanced courses. It is designed in twelve chapters followed by laboratory setups and experiments. Each chapter has multiple-choice questions with answers, as well as review questions and critical thinking questions. The chapters are practically focused, meaning that the information will also be relevant and useful for professionals wanting an overview of the topic.
For more than a decade, the focus of information technology has been on capturing and sharing data from a patient within an all-encompassing record (a.k.a. the electronic health record, EHR) to promote improved longitudinal oversight in the care of the patient. Opinions differ as to whether this goal has been met, but it is certainly evolving. A key element of improved patient care has been the automated capture of data from durable medical devices that are the source of (mostly) objective data, from imagery to time-series histories of vital signs and spot-assessments of patients. The capture and use of these data to support clinical workflows have been written about and thoroughly debated. Yet the use of these data for clinical guidance has been the subject of various papers published in respected medical journals, but without a coherent focus on the general subject of the clinically actionable benefits of objective medical device data for clinical decision-making purposes. Hence, the uniqueness of this book is in providing a single point of capture for the targeted clinical benefits of medical device data--both electronic-health-record-based and real-time--for improved clinical decision-making at the point of care, and for the use of these data to address and assess specific types of clinical surveillance. Clinical Surveillance: The Actionable Benefits of Objective Medical Device Data for Crucial Decision-Making focuses on the use of objective, continuously collected medical device data for the purpose of identifying patient deterioration, with a primary focus on data normally obtained from both the higher-acuity care settings of intensive care units and the lower-acuity settings of general care wards. It includes examples of conditions that show earlier signs of deterioration, including systemic inflammatory response syndrome, opioid-induced respiratory depression, shock induced by systemic failure, and more. The book provides education on how to use these data, such as for clinical interventions, in order to identify examples of how to guide care using automated durable medical device data from higher- and lower-acuity care settings. The book also includes real-world examples of applications that are of high value to clinical end-users and health systems.
This book constitutes the refereed proceedings of the 16th IFIP WG 9.4 International Conference on Social Implications of Computers in Developing Countries, ICT4D 2020, which was supposed to be held in Salford, UK, in June 2020, but was held virtually instead due to the COVID-19 pandemic. The 18 revised full papers presented were carefully reviewed and selected from 29 submissions. The papers present a wide range of perspectives and disciplines including (but not limited to) public administration, entrepreneurship, business administration, information technology for development, information management systems, organization studies, philosophy, and management. They are organized in the following topical sections: digital platforms and gig economy; education and health; inclusion and participation; and business innovation and data privacy.
Based on the technical accumulation and practice of Huawei iLab in the Cloud VR field, this book systematically describes the advantages of Cloud VR technologies; technical requirements on clouds, networks, and terminals as well as solution implementation; Cloud VR experience evaluation baselines and methods; and current business practices. Cloud VR introduces cloud computing and cloud rendering to VR services: with fast and stable networks, cloud-based display output and audio output are coded, compressed, and transmitted to user terminals, moving VR service content and content rendering into the cloud. Cloud VR has stringent requirements on bandwidth and latency, making it a natural application for 5G and gigabit home broadband networks in the era of "dual G". As the first advocate of Cloud VR, Huawei iLab developed the first prototype of the Cloud VR technical solution, initiated the industry's first Cloud VR cooperation plan (VR OpenLab) with partners, and incubated the world's first Cloud VR commercial project with China Mobile Fujian. Cloud VR: Technology and Application is the first official publication of Huawei iLab's research and practice achievements. It systematically and thoroughly introduces the Cloud VR concept, solution architecture, key technologies, and business practices, and is of great value in academic and social applications. This book is easy to understand, practical, and suitable for VR vendors, VR technology enthusiasts, carriers, network vendors, cloud service providers, universities, and other enterprises and scientific research institutes.
How can we recruit out of your program? We have a project - how do we reach out to your students? If we do research together, who owns it? We have employees who need to "upskill" in analytics - can you help me with that? How much does all of this cost? Managers and executives are increasingly asking university professors such questions as they deal with a critical shortage of skilled data analysts. At the same time, academics are asking such questions as: How can I bring a "real" analytical project into the classroom? How can I get "real" data to help my students develop the skills necessary to be a "data scientist"? Is what I am teaching in the classroom aligned with the demands of the market for analytical talent? After spending several years answering almost daily e-mails and telephone calls from business managers asking for staffing help and aiding fellow academics with their analytics teaching needs, Dr. Jennifer Priestley of Kennesaw State University and Dr. Robert McGrath of the University of New Hampshire wrote Closing the Analytics Talent Gap: An Executive's Guide to Working with Universities. The book builds a bridge between university analytics programs and business organizations. It promotes a dialog that enables executives to learn how universities can help them find strategically important personnel, and universities to learn how they can develop and educate this personnel. Organizations are facing previously unforeseen challenges related to the translation of massive amounts of data - structured and unstructured, static and in-motion, voice, text, and image - into information to solve current challenges and anticipate new ones. The advent of analytics and data science also presents universities with unforeseen challenges of providing learning through application. This book helps both organizations with finding "data natives" and universities with educating students to develop the facility to work in a multi-faceted and complex data environment.
Modern network systems such as the Internet of Things, smart grids, VoIP traffic, peer-to-peer protocols, and social networks are inherently complex. They require powerful and realistic models and tools not only for analysis and simulation but also for prediction. This book covers important topics and approaches related to the modeling and simulation of complex communication networks from a complex adaptive systems perspective. The authors present different modeling paradigms and approaches as well as surveys and case studies. With contributions from an international panel of experts, this book is essential reading for networking, computing, and communications professionals, researchers and engineers in the field of next generation networks and complex information and communication systems, and academics and advanced students working in these fields.
Every day, millions of people are unaware of the amazing processes that take place when using their phones, connecting to broadband internet, watching television, or even the most basic action of flipping on a light switch. Advances are being continually made in not only the transmission of this data but also in the new methods of receiving it. These advancements come from many different sources and from engineers who have engaged in research, design, development, and implementation of electronic equipment used in communications systems. This volume addresses a selection of important current advancements in the electronics and communications engineering fields, focusing on signal processing, chip design, and networking technology. The sections in the book cover:
* Microwave and antennas
* Communications systems
* Very large-scale integration
* Embedded systems
* Intelligent control and signal processing systems
Citrix XenDesktop Implementation explores the implementation of Citrix XenDesktop, a virtual desktop infrastructure solution. After introducing desktop virtualization, the book discusses the installation of a desktop delivery controller through advanced XenDesktop client settings. It briefly discusses the workings of the desktop delivery controller mechanisms, followed by the installation process, the integration of XenDesktop with Microsoft Active Directory, and the configuration of the desktop delivery controller. It then examines the process of installing the virtual desktop onto the server infrastructure, followed by installation and integration onto the XenServer, Hyper-V, and VMware hypervisors. Furthermore, it discusses the advanced configuration settings. The book covers the installation of the Citrix Provisioning Server and its fundamental configuration. It also explores the configuration of Citrix XenApp for application provisioning, the integration of virtual applications, and the implementation of virtual profiles into the virtual desktop. The book concludes by explaining the advanced XenDesktop client settings for audio, video, and peripherals.
This comprehensive text/reference examines the various challenges to secure, efficient and cost-effective next-generation wireless networking. Topics and features: presents the latest advances, standards and technical challenges in a broad range of emerging wireless technologies; discusses cooperative and mesh networks, delay tolerant networks, and other next-generation networks such as LTE; examines real-world applications of vehicular communications, broadband wireless technologies, RFID technology, and energy-efficient wireless communications; introduces developments towards the 'Internet of Things' from both a communications and a service perspective; discusses the machine-to-machine communication model, important applications of wireless technologies in healthcare, and security issues in state-of-the-art networks.
This book constitutes the refereed post-conference proceedings of the 4th International Conference on Intelligence Science, ICIS 2020, held in Durgapur, India, in February 2021 (originally November 2020). The 23 full papers and 4 short papers presented were carefully reviewed and selected from 42 submissions. One extended abstract is also included. They deal with key issues in brain cognition; uncertain theory; machine learning; data intelligence; language cognition; vision cognition; perceptual intelligence; intelligent robot; and medical artificial intelligence.
This book offers a rigorous analysis of the achievements in the field of traffic control in large networks, oriented toward two main aspects: the self-similarity of traffic behaviour and the scale-free characteristic of a complex network. Additionally, the authors propose a new insight into the inner nature of things and into cause and effect, based on the identification of relationships and behaviours within a model grounded in the study of the influence of the topological characteristics of a network upon traffic behaviour. The effects of this influence are then discussed in order to find new solutions for traffic monitoring and diagnosis and also for the prediction of traffic anomalies. Although these concepts are illustrated using highly accurate, highly aggregated packet traces collected on backbone Internet links, the results of the analysis can be applied to any complex network whose traffic processes exhibit asymptotic self-similarity, perceived as an adaptability of traffic in networks. However, the problem with self-similar models is that they are computationally complex: their fitting procedure is very time-consuming, and their parameters cannot be estimated from online measurements. To this end, the main objective of this book is to discuss the problem of traffic prediction in the presence of self-similarity and, particularly, to offer a possibility to forecast future traffic variations and to predict network performance as precisely as possible, based on the measured traffic history.
Software Defined Networking: Design and Deployment provides a comprehensive treatment of software defined networking (SDN) suitable for new network managers and experienced network professionals. Presenting SDN in context with more familiar network services and challenges, this accessible text:
* Explains the importance of virtualization, particularly the impact of virtualization on servers and networks
* Addresses SDN, with an emphasis on the network control plane
* Discusses SDN implementation and the impact on service providers, legacy networks, and network vendors
* Contains a case study on Google's initial implementation of SDN
* Investigates OpenFlow, the hand-in-glove partner of SDN
* Looks forward toward more programmable networks and the languages needed to manage these environments
Software Defined Networking: Design and Deployment offers a unique perspective on the business case and technology motivations for considering SDN solutions. By identifying the impact of SDN on traffic management and the potential for network service growth, this book instills the knowledge needed to manage current and future demand and provisioning for SDN.
Nonfunctional Requirements in Mobile Application Development is an empirical study that investigates how nonfunctional requirements--as compared with functional requirements--are treated by the software engineers during mobile application development. The book empirically analyzes the contribution of nonfunctional requirements to project parameters such as cost, time, and quality. Such parameters are of prime interest as they determine the survival of organizations in highly dynamic environments. The impact of nonfunctional requirements on project success is analyzed through surveys and case studies, both individually and relative to each other. Sources for data collection include industry, academia, and literature. The book also empirically studies the impact of nonfunctional requirements on the overall business success of both the software development firm and the software procuring firm. Project success is examined to determine if it leads to business success. The book provides rich empirical evidence to place nonfunctional requirements on par with functional requirements to achieve business success in highly competitive markets. This work enhances the body of knowledge through multiple empirical research methods including surveys, case studies, and experimentation to study software engineers' focus on nonfunctional requirements at both project and business levels. The book can guide both computer scientists and business managers in devising theoretical and technical solutions for software release planning to achieve business success.
This book serves three basic purposes: (1) a tutorial-type reference for complex systems engineering (CSE) concepts and associated terminology, (2) a recommendation of a proposed methodology showing how the evolving practice of CSE can lead to a more unified theory, and (3) a complex systems (CSs) initiative for organizations to invest some of their resources toward helping to make the world a better place. A wide variety of technical practitioners--e.g., developers of new or improved systems (particularly systems engineers), program and project managers, associated staff/workers, funders and overseers, government executives, military officers, systems acquisition personnel, contract specialists, owners of large and small businesses, professional society members, and CS researchers--may be interested in further exploring these topics. Readers will learn more about CS characteristics and behaviors and CSE principles and will therefore be able to focus on techniques that will better serve them in their everyday work environments in dealing with complexity. The fundamental observation is that many systems inherently involve a deeper complexity because stakeholders are engaged in the enterprise. This means that such CSs are more difficult to invent, create, or improve upon because no one can be in total control since people cannot be completely controlled. Therefore, one needs to concentrate on trying to influence progress, then wait a suitable amount of time to see what happens, iterating as necessary. With just three chapters in this book, it seems to make sense to provide a tutorial introduction that readers can peruse only as necessary, considering their background and understanding, then a chapter laying out the suggested artifacts and methodology, followed by a chapter emphasizing worthwhile areas of application.
Future information technology stands for all continuously evolving and converging information technologies, including digital convergence, multimedia convergence, intelligent applications, embedded systems, mobile and wireless communications, bio-inspired computing, grid and cloud computing, the semantic web, user experience and HCI, security and trust computing, and so on, for satisfying our ever-changing needs. In the past twenty-five years or so, information technology (IT) has influenced and changed every aspect of our lives and our cultures. These proceedings foster the dissemination of state-of-the-art research in all future IT areas, including their models, services, and novel applications associated with their utilization.
You may like...
* Practical Industrial Data Networks… by Steve Mackay, Edwin Wright, … (Paperback) - R1,452 / Discovery Miles 14 520
* Research Anthology on Architectures… by Information Resources Management Association (Hardcover) - R12,630 / Discovery Miles 126 300
* Opinion Mining and Text Analytics on… by Pantea Keikhosrokiani, Moussa Pourya Asl (Hardcover) - R9,276 / Discovery Miles 92 760
* CCNA 200-301 Network Simulator by Sean Wilkins (Digital product license key) - R2,877 / Discovery Miles 28 770
* The Host in the Machine - Examining the… by Angela Thomas-Jones (Paperback) - R1,318 / Discovery Miles 13 180