This book provides emergent knowledge on physical, cyber, and human risk mitigation in a practical and readable form for the corporate environment. It presents and discusses practical applications of risk management techniques along with usable policy-change options. This organizational security management approach examines multiple aspects of security to protect against physical, cyber, and human risk, with a more tactical focus on managing vulnerabilities and applying countermeasures. The book guides readers to a greater depth of understanding and to action-oriented options.
This book illustrates the importance of business impact analysis, which covers risk assessment and moves towards a better understanding of the business environment, industry-specific compliance, the legal and regulatory landscape, and the need for business continuity. It provides charts, checklists, and flow diagrams that give a roadmap to collect, collate, and analyze data, and give enterprise management a complete mapping of controls covering all compliance requirements the enterprise is subject to. The book helps professionals build a control framework tailored to the enterprise that incorporates best practices and relevant standards. The book:
- Presents a practical approach to assessing the security, performance, and business continuity needs of the enterprise
- Helps readers understand common objectives for audit, compliance, internal/external audit, and assurance
- Demonstrates how to build a customized controls framework that fulfills common audit criteria, business resilience needs, and internal monitoring of the effectiveness of controls
- Presents an integrated audit approach to fulfill all compliance requirements
This book explores a broad cross section of research and actual case studies to draw out new insights that may be used to build a benchmark for IT security professionals. This research takes a deeper dive beneath the surface to uncover novel ways to mitigate data security vulnerabilities, connect the dots, and identify patterns in breach data. This analysis will assist security professionals not only in benchmarking their risk management programs but also in identifying forward-looking security measures to narrow the path of future vulnerabilities.
This title includes a number of Open Access chapters. Model-driven engineering (MDE) is the automatic production of software from simplified models of structure and functionality. It mainly involves the automation of routine and technologically complex programming tasks, allowing developers to focus on the true value-adding functionality that the system needs to deliver. This book serves as an overview of some of the core topics in MDE. The volume is broken into two sections offering a selection of papers that help the reader not only understand MDE principles and techniques, but also learn from practical examples. Also covered are the following topics:
* MDE for software product lines
* Formal methods for model transformation correctness
* Metamodeling with Eclipse eCore
* Metamodeling with UML profiles
* Test case generation
This easily accessible reference volume offers a comprehensive guide to a rapidly expanding field. Edited by writers experienced in both the research and practice of software engineering, Model-Driven Engineering of Information Systems: Principles, Techniques and Practice is an authoritative and easy-to-use reference, ideal both for researchers in the field and for students who wish to gain an overview of this important area of study.
Blockchain is a technology that has attracted the attention of all types of businesses. Cryptocurrency such as Bitcoin has gained the most attention, but companies are now applying Blockchain technology to develop solutions that improve traditional applications and secure all types of transactions. Robust and innovative, this technology is being combined with other well-known technologies, including Cloud Computing, Big Data, and IoT, to revolutionize outcomes in all verticals. Unlike books focused on financial applications, Essential Enterprise Blockchain Concepts and Applications is for researchers and practitioners who are looking for secure, viable, low-cost, and workable applications to solve a broad range of business problems. The book presents research that rethinks how to incorporate Blockchain with existing technology. Chapters cover various applications based on Blockchain technology, including:
- Digital voting
- Smart contracts
- Supply chain management
- Internet security
- Logistics management
- Identity management
- Securing medical devices
- Asset management
Blockchain plays a significant role in providing security for data operations. It defines how trusted transactions can be carried out and addresses Internet vulnerability problems. Blockchain closes the security fault line between AI and IoT in smart systems, as well as in other systems whose devices are connected to each other through public networks. Blockchain maintains linear, permanent, indexed records to address vulnerability issues in a wide variety of applications. In addition to applications, the book also covers consensus algorithms and protocols and the performance of Blockchain algorithms.
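To ground the idea of "linear and permanent indexed records," here is a minimal Python sketch of a hash-linked chain of blocks, the data structure underlying the applications listed above. It is an illustration only: the class and field names are invented for this example and do not come from the book, and a real deployment would add consensus among nodes and digital signatures.

```python
import hashlib
import json
import time

def sha256(data: str) -> str:
    """Hex digest of a UTF-8 string."""
    return hashlib.sha256(data.encode("utf-8")).hexdigest()

class Block:
    def __init__(self, index, records, prev_hash):
        self.index = index              # linear position in the chain
        self.timestamp = time.time()
        self.records = records          # e.g. votes, shipments, asset transfers
        self.prev_hash = prev_hash      # link to the previous block
        self.hash = self.compute_hash()

    def compute_hash(self):
        payload = json.dumps(
            {"index": self.index, "timestamp": self.timestamp,
             "records": self.records, "prev_hash": self.prev_hash},
            sort_keys=True)
        return sha256(payload)

def verify(chain):
    """True only if every block still hashes correctly and links to its predecessor."""
    for prev, curr in zip(chain, chain[1:]):
        if curr.prev_hash != prev.hash or curr.hash != curr.compute_hash():
            return False
    return True

# Tampering with any stored record breaks every later link.
genesis = Block(0, ["genesis"], "0" * 64)
chain = [genesis, Block(1, [{"vote": "candidate-A"}], genesis.hash)]
print(verify(chain))                    # True
chain[1].records[0]["vote"] = "candidate-B"
print(verify(chain))                    # False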
Internet attacks on computer systems are pervasive. It can take less than a minute, or as long as eight hours, for an unprotected machine connected to the Internet to be completely compromised. It is the information security architect's job to prevent attacks by securing computer systems. This book describes both the process and the practice of assessing a computer system's existing information security posture. Detailing the time-tested practices of experienced security architects, it explains how to deliver the right security at the right time in the implementation lifecycle. Securing Systems: Applied Security Architecture and Threat Models covers all types of systems, from the simplest applications to complex, enterprise-grade, hybrid cloud architectures. It describes the many factors and prerequisite information that can influence an assessment. The book covers the following key aspects of security analysis: When should the security architect begin the analysis? At what points can a security architect add the most value? What are the activities the architect must execute? How are these activities delivered? What is the set of knowledge domains applied to the analysis? What are the outputs? What are the tips and tricks that make security architecture risk assessment easier? To help you build skill in assessing architectures for security, the book presents six sample assessments. Each assessment examines a different type of system architecture and introduces at least one new pattern for security analysis. The goal is that after you have seen a sufficient diversity of architectures, you will be able to understand varied architectures and better see the attack surfaces and prescribe security solutions.
DataOps is a new way of delivering data and analytics that is proven to get results. It enables IT and users to collaborate in the delivery of solutions that help organisations embrace a data-driven culture. The DataOps Revolution: Delivering the Data-Driven Enterprise is a narrative about real-world issues involved in using DataOps to make data-driven decisions in modern organisations. The book is built around real delivery examples based on the author's own experience and lays out principles and a methodology for business success using DataOps. Presenting practical design patterns and DataOps approaches, the book shows how DataOps projects are run and presents the benefits of using DataOps to implement data solutions. Best practices are introduced through the telling of a story, which relates how a lead manager must find a way through complexity to turn an organisation around. This narrative vividly illustrates DataOps in action, enabling readers to incorporate best practices into everyday projects. The book tells the story of an embattled CIO who turns to a new and untested project manager charged with a wide remit to roll out DataOps techniques across an entire organisation. It illustrates a different approach to addressing the challenges in bridging the gap between IT and the business. The approach presented in this story aligns with the six IMPACT pillars of the DataOps model that Kinaesis (www.kinaesis.com) has been using through its consultants to deliver successful projects and turn around failing deliveries. The pillars help to organise thinking and structure an approach to project delivery. The pillars are broken down and translated into steps that can be applied to real-world projects, delivering satisfaction and fulfillment to customers and project team members.
Using the SARS-CoV-2/COVID-19 pandemic as a giant case study, and following the structure of the domains of information security, this book looks at what the crisis teaches us about security. It points out the security fundamentals where social, medical, or business responses to the crisis failed or needed to make particular use of those concepts. For the most part, these lessons are simply reminders of factors that get neglected during times of non-crisis. The lessons particularly point out the importance of planning and resilience in systems and business. Those studying cybersecurity and its preventive measures and applications, as well as those involved in risk management studies and assessments, will all benefit greatly from the book. Robert Slade has had an extensive and prolific career in management, security, and telecommunications research, analysis, and consultancy. He has served as an educator, visiting universities and delivering lectures and seminars.
A Practical Introduction to Enterprise Network and Security Management, Second Edition, provides a balanced understanding of introductory and advanced subjects in both computer networking and cybersecurity. Although much of the focus is on technical concepts, managerial issues related to enterprise network and security planning and design are explained from a practitioner's perspective. Because of the critical importance of cybersecurity in today's enterprise networks, security-related issues are explained throughout the book, and four chapters are dedicated to fundamental knowledge. Challenging concepts are explained so that readers can follow them with careful reading. This book is written for those who are self-studying or studying information systems or computer science in a classroom setting. If used for a course, it has enough material for a semester or a quarter. FEATURES:
- Provides both theoretical and practical hands-on knowledge and learning experiences for computer networking and cybersecurity
- Offers a solid knowledge base for those preparing for certification tests, such as CompTIA and CISSP
- Takes advantage of actual cases, examples, industry products, and services so students can relate concepts and theories to practice
- Explains subjects in a systematic and practical manner to facilitate understanding
- Includes practical exercise questions that can be individual or group assignments within or outside a classroom
- Contains several information-rich screenshots, figures, and tables carefully constructed to solidify concepts and enhance visual learning
The text is designed for students studying information systems or computer science for the first time. As a textbook, this book includes hands-on assignments based on the Packet Tracer program, an excellent network design and simulation tool from Cisco. Instructor materials are also provided, including PowerPoint slides, solutions for exercise questions, and additional chapter questions from which to build tests.
The past decade has seen a dramatic increase in the amount and variety of information that is generated and stored electronically by business enterprises. Storing this increased volume of information has not been a problem to date, but as these information stores grow larger and larger, multiple challenges arise for senior management: namely, questions such as "How much is our data worth?" "Are we storing our data in the most cost-effective way?" "Are we managing our data effectively and efficiently?" "Do we know which data is most important?" "Are we extracting business insight from the right data?" "Are our data adding to the value of our business?" "Are our data a liability?" "What is the potential for monetizing our data?" and "Do we have an appropriate risk management plan in place to protect our data?" To answer these value-based questions, data must be treated with the same rigor and discipline as other tangible and intangible assets. In other words, corporate data should be treated as a potential asset and should have its own asset valuation methodology that is accepted by the business community, the accounting and valuation community, and other important stakeholder groups. Valuing Data: An Open Framework is a first step in that direction. Its purpose is to:
- Provide the reader with some background on the nature of data
- Present the common categories of business data
- Explain the importance of data management
- Report the current thinking on data valuation
- Offer some business reasons to value data
- Present an "open framework," along with some proposed methods, for valuing data
The book does not aim to prescribe exactly how data should be valued monetarily; rather, it is a starting point for a discussion of data valuation, with the objective of developing a stakeholder consensus which, in turn, will become accepted standards and practices.
As digital technologies occupy a more central role in working and everyday human life, individual and social realities are increasingly constructed and communicated through digital objects, which are progressively replacing and representing physical objects. They are even shaping new forms of virtual reality. This growing digital transformation, coupled with technological evolution and the development of computer computation, is shaping a cyber society whose working mechanisms are grounded upon the production, deployment, and exploitation of big data. In the arts and humanities, however, the notion of big data is still in its embryonic stage, and only in the last few years have arts and cultural organizations and institutions, artists, and humanists started to investigate, explore, and experiment with the deployment and exploitation of big data, as well as understand the possible forms of collaborations based on it. Big Data in the Arts and Humanities: Theory and Practice explores the meaning, properties, and applications of big data. This book examines the relevance of big data to the arts and humanities, digital humanities, and the management of big data with and for the arts and humanities. It explores the reasons and opportunities for the arts and humanities to embrace the big data revolution. The book also delineates managerial implications to successfully shape a mutually beneficial partnership between the arts and humanities and the big data- and computational digital-based sciences. Big data and the arts and humanities can be likened to the rational and emotional aspects of the human mind. This book attempts to integrate these two aspects of human thought to advance decision-making and to enhance the expression of the best of human life.
As the 2020 global lockdown became a universal strategy to control the COVID-19 pandemic, social distancing triggered a massive reliance on online and cyberspace alternatives and switched the world to the digital economy. Despite their effectiveness for remote work and online interactions, cyberspace alternatives ignited several Cybersecurity challenges. Malicious hackers capitalized on global anxiety and launched cyberattacks against unsuspecting victims. Internet fraudsters exploited human and system vulnerabilities and impacted data integrity, privacy, and digital behaviour. Cybersecurity in the COVID-19 Pandemic demystifies Cybersecurity concepts using real-world cybercrime incidents from the pandemic to illustrate how threat actors perpetrated computer fraud against valuable information assets, particularly healthcare, financial, commercial, travel, academic, and social networking data. The book simplifies the socio-technical aspects of Cybersecurity and draws valuable lessons from the impacts COVID-19 cyberattacks exerted on computer networks, online portals, and databases. The book also predicts the fusion of Cybersecurity into Artificial Intelligence and Big Data Analytics, the two emerging domains that will potentially dominate and redefine post-pandemic Cybersecurity research and innovations between 2021 and 2025. The book's primary audience is individual and corporate cyberspace consumers across all professions intending to update their Cybersecurity knowledge for detecting, preventing, responding to, and recovering from computer crimes. Cybersecurity in the COVID-19 Pandemic is ideal for information officers, data managers, business and risk administrators, technology scholars, Cybersecurity experts and researchers, and information technology practitioners. Readers will draw lessons for protecting their digital assets from email phishing fraud, social engineering scams, malware campaigns, and website hijacks.
With the advent of the IT revolution, the volume of data produced has increased exponentially and is still showing an upward trend. This data may be abundant and enormous, but it is a precious resource and should be managed properly. Cloud technology plays an important role in data management. Storing data in the cloud rather than on local storage has many benefits, but alongside these benefits there are privacy concerns in storing sensitive data on third-party servers. These concerns can be addressed by storing data in an encrypted form; however, while encryption solves the problem of privacy, it engenders other serious issues, including the infeasibility of the fundamental search operation and reduced flexibility when sharing data with other users. The concept of searchable encryption addresses these issues. This book provides every detail required to develop a secure, searchable encryption scheme using both symmetric and asymmetric cryptographic primitives, along with the appropriate security models to ensure the minimum security requirements for real-world applications.
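To make the idea of searchable encryption concrete, here is a minimal sketch of a symmetric scheme in Python: the client encrypts documents locally and attaches deterministic keyword tokens, so an untrusted server can answer searches without ever seeing plaintext or keys. Everything here is illustrative; the class and function names are invented for this example, and the XOR keystream is only a stand-in for a real cipher such as AES-GCM.

```python
import hashlib
import hmac
import os

def keyword_token(key: bytes, word: str) -> str:
    """Deterministic keyword token: the server can match tokens without learning the word."""
    return hmac.new(key, word.lower().encode(), hashlib.sha256).hexdigest()

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Placeholder XOR stream cipher for illustration only -- use AES-GCM in practice."""
    nonce = os.urandom(16)
    stream = hashlib.sha256(key + nonce).digest()
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(stream).digest()
    return nonce + bytes(p ^ s for p, s in zip(plaintext, stream))

class EncryptedStore:
    """Untrusted server side: holds ciphertexts and keyword tokens, never plaintext or keys."""
    def __init__(self):
        self.docs = {}      # doc_id -> ciphertext
        self.index = {}     # token  -> set of doc_ids

    def upload(self, doc_id, ciphertext, tokens):
        self.docs[doc_id] = ciphertext
        for t in tokens:
            self.index.setdefault(t, set()).add(doc_id)

    def search(self, token):
        return self.index.get(token, set())

# Client side: encrypt locally, derive tokens, and search by token.
enc_key, idx_key = os.urandom(32), os.urandom(32)
server = EncryptedStore()
server.upload("doc1",
              toy_encrypt(enc_key, b"quarterly audit report"),
              [keyword_token(idx_key, w) for w in ("quarterly", "audit", "report")])
print(server.search(keyword_token(idx_key, "audit")))    # {'doc1'}
print(server.search(keyword_token(idx_key, "payroll")))  # set()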
Software-Defined Data Infrastructures Essentials provides fundamental coverage of physical, cloud, converged, and virtual server storage I/O networking technologies, trends, tools, techniques, and tradecraft skills. From webscale, software-defined, containers, database, key-value store, cloud, and enterprise to small or medium-size business, the book is filled with techniques and tips to help develop or refine your server storage I/O hardware, software, and services skills. Whether you are new to data infrastructures or a seasoned pro, you will find this comprehensive reference indispensable for gaining as well as expanding experience with technologies, tools, techniques, and trends. "We had a front-row seat watching Greg present material that is in this book live in our education workshop seminar sessions for ITC professionals in the Netherlands. We recommend this amazing book to expand your converged and data infrastructure knowledge, from beginners to industry veterans." -Gert and Frank Brouwer, Brouwer Storage Consultancy "Software-Defined Data Infrastructures Essentials provides the foundational building blocks to improve your craft in several areas including applications, clouds, legacy, and more. IT professionals, as well as sales professionals and support personnel, stand to gain a great deal by reading this book." -Mark McSherry, Oracle Regional Sales Manager "Looking to expand your data infrastructure IQ? From CIOs to operations, sales to engineering, this book is a comprehensive reference, a must read for IT infrastructure professionals, beginners to seasoned experts." -Tom Becchetti, Advisory Systems Engineer "Greg Schulz has provided a complete 'toolkit' for storage management along with the background and framework for the storage or data infrastructure professional or those aspiring to become one." -Greg Brunton, Experienced Storage and Data Management Professional
This book offers practical insight to leaders who need to make good decisions in risky and important situations. The authors describe a process for making risk-intelligent decisions, explaining complex ideas simply, and mapping a route through the myriad interrelated influences when groups make decisions that matter. The approach puts the decision maker (you) at the center and explains how you can think and act differently to make better decisions more of the time. The book shows how to:
- Determine the appropriate level of risk
- Make decisions in uncertain and turbulent conditions
- Understand how risks are perceived to identify them accurately
- Develop new behaviors to improve decision-making
Making Risky and Important Decisions: A Leader's Guide builds on earlier ground-breaking publications from these two recognized thought leaders. Their first book together, Understanding and Managing Risk Attitude, brought together the language of risk and risk-taking with the language of emotional intelligence and emotional literacy. Managing Group Risk Attitude followed, focusing on decision-making groups and creating new insights and frameworks. Both books are positioned as specialist textbooks, despite their relevance to real-world situations. A Short Guide to Risk Appetite brought together the concepts of risk appetite and risk attitude into one place for the first time, cutting through confusing terminology and confused thinking to create a practical way of understanding "how much risk is too much risk." This latest installment from Ruth Murray-Webster and David Hillson takes the breadth of their previous work, adds new insights and thinking, and distills it into a highly usable guide for hard-pressed leaders.
"This book addresses how health apps, in-home measurement devices, telemedicine, data mining, and artificial intelligence and smart medical algorithms are all enabled by the transition to a digital health infrastructure.....it provides a comprehensive background with which to understand what is happening in healthcare informatics and why."-C. William Hanson, III, MD, Chief Medical Information Officer and Vice President, University of Pennsylvania Health System. "This book is dedicated to the frontline healthcare workers, who through their courage and honor to their profession, helped maintain a reliable service to the population at large, during a chaotic time. These individuals withstood fear and engaged massive uncertainty and risk to perform their duties of providing care to those in need at a time of crisis. May the world never forget the COVID-19 pandemic and the courage of our healthcare workers".-Stephan P. Kudyba, Author Healthcare Informatics: Evolving Strategies in the Digital Era focuses on the services, technologies, and processes that are evolving in the healthcare industry. It begins with an introduction to the factors that are driving the digital age as it relates to the healthcare sector and then covers strategic topics such as risk management, project management, and knowledge management that are essential for successful digital initiatives. It delves into facets of the digital economy and how healthcare is adapting to the geographic, demographic, and physical needs of the population and highlights the emergence and importance of apps and telehealth. It also provides a high-level approach to managing pandemics by applying the various elements of the digital ecosystem. The book covers such technologies as: Computerized physician order entry (CPOE) Clinical Information Systems Alerting systems and medical sensors Electronic healthcare records (EHRs) Mobile healthcare and telehealth. Apps Business Intelligence and Decision Support Analytics Digital outreach to the population Artificial Intelligence The book then closes the loop on the efficiency enhancing process with a focus on utilizing analytics for problem solving for a variety of healthcare processes including the pharmaceutical sector. Finally, the book ends with current and futuristic views on evolving applications of AI throughout the industry.
A staggering 70% of digital transformations have failed, as per McKinsey. The key reason enterprises fail in their digital transformation journey is that there is no standard framework in the industry that enterprises can use to transform themselves to digital. Several books discuss technologies such as Cloud, Artificial Intelligence, and Data Analytics in silos, but none provides a holistic view on how enterprises can embark on a digital transformation journey and succeed using a combination of these technologies. FORMULA 4.0 is a methodology that provides clear guidance for enterprises aspiring to transform their traditional operating model to digital, and enterprises can use this framework as a readymade guide to plan their digital transformation journey. This book is intended for all chief executives, software managers, and leaders who intend to successfully lead this digital transformation journey. An enterprise can achieve success in digital transformation only if it can create an IT platform that enables it to adopt any new technology seamlessly into the existing IT estate; deliver new products and services to the market in shorter durations; make business decisions with IT as an enabler; and utilize automation in all its major business and IT processes. Achieving these goals is what defines a digital enterprise, and Formula 4.0 is a methodology for enterprises to achieve these goals and become digital. No existing framework in the market provides a step-by-step guide on how to embark on a successful digital transformation journey; this book enables such transformations. Overall, Formula 4.0 is an enterprise digital transformation framework that enables organizations to become truly digital.
Cloud reliability engineering is a leading concern for cloud services. Cloud service providers guarantee computation, storage, and applications through service-level agreements (SLAs) for promised levels of performance and uptime. Cloud Reliability Engineering: Technologies and Tools presents case studies examining cloud services, their challenges, and the reliability mechanisms used by cloud service providers. These case studies provide readers with techniques to harness cloud reliability and availability requirements in their own endeavors. Both conceptual and applied, the book explains reliability theory and the best practices used by cloud service companies to provide high availability. It also examines load balancing and cloud security. Written by researchers and practitioners, the book's chapters offer a comprehensive study of cloud reliability and availability issues and solutions. Various reliability class distributions and their effects on cloud reliability are discussed. Reliability block diagrams are used to locate poorly performing parts of a cloud infrastructure where enhancements can be made to lower the failure rate of the system; this technique can be applied in the design and functional stages to identify poor reliability in a system and to target improvements. Load balancing for reliability is examined as a migration process or performed by using virtual machines. The approach employed to identify the lightly loaded destination node to which processes or virtual machines migrate can be optimized by employing a genetic algorithm. To analyze security risk and reliability, a novel technique for minimizing the number of keys and the security system is presented. The book also provides an overview of testing methods for the cloud, and a case study discusses testing reliability, installability, and security. A comprehensive volume, Cloud Reliability Engineering: Technologies and Tools combines research, theory, and best practices used to engineer reliable cloud availability and performance.
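One of the techniques the description mentions, the reliability block diagram, reduces to simple arithmetic: component reliabilities multiply in series, while redundant components fail only if every replica fails. The short Python sketch below illustrates this for a hypothetical cloud stack; the component names and reliability figures are invented for illustration and are not taken from the book's case studies.

```python
from functools import reduce

def series(*reliabilities):
    """All blocks must work: R = product of component reliabilities."""
    return reduce(lambda a, b: a * b, reliabilities, 1.0)

def parallel(*reliabilities):
    """Redundant blocks: the path fails only if every replica fails."""
    failure = reduce(lambda a, b: a * (1 - b), reliabilities, 1.0)
    return 1 - failure

# Hypothetical cloud service: a load balancer in series with three redundant
# web VMs, in series with a two-replica database tier.
lb = 0.999
web = parallel(0.97, 0.97, 0.97)   # three web VMs behind the balancer
db = parallel(0.995, 0.995)        # primary plus replica
system = series(lb, web, db)

for name, r in [("load balancer", lb), ("web tier", web), ("db tier", db)]:
    print(f"{name:14s} R = {r:.6f}")
print(f"{'system':14s} R = {system:.6f}")
# The block with the lowest reliability is the first target for improvement.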
How can we recruit out of your program? We have a project - how do we reach out to your students? If we do research together, who owns it? We have employees who need to "upskill" in analytics - can you help me with that? How much does all of this cost? Managers and executives are increasingly asking university professors such questions as they deal with a critical shortage of skilled data analysts. At the same time, academics are asking questions such as: How can I bring a "real" analytical project into the classroom? How can I get "real" data to help my students develop the skills necessary to be a "data scientist"? Is what I am teaching in the classroom aligned with the demands of the market for analytical talent? After spending several years answering almost daily e-mails and telephone calls from business managers asking for staffing help and aiding fellow academics with their analytics teaching needs, Dr. Jennifer Priestley of Kennesaw State University and Dr. Robert McGrath of the University of New Hampshire wrote Closing the Analytics Talent Gap: An Executive's Guide to Working with Universities. The book builds a bridge between university analytics programs and business organizations. It promotes a dialog that enables executives to learn how universities can help them find strategically important personnel and universities to learn how they can develop and educate this personnel. Organizations are facing previously unforeseen challenges related to the translation of massive amounts of data - structured and unstructured, static and in-motion, voice, text, and image - into information to solve current challenges and anticipate new ones. The advent of analytics and data science also presents universities with unforeseen challenges of providing learning through application. This book helps both organizations with finding "data natives" and universities with educating students to develop the facility to work in a multi-faceted and complex data environment.
The complex challenges facing healthcare require innovative solutions that can make patient care more effective, easily available, and affordable. One such solution is the digital reconstruction of medicine that transitions much of patient care from hospitals, clinics, and offices to a variety of virtual settings. This reconstruction involves telemedicine, hospital-at-home services, mobile apps, remote sensing devices, clinical data analytics, and other cutting-edge technologies. The Digital Reconstruction of Healthcare: Transitioning from Brick and Mortar to Virtual Care takes a deep dive into these tools and how they can transform medicine to meet the unique needs of patients across the globe. This book enables readers to peer into the very near future and prepares them for the opportunities afforded by the digital shift in healthcare. It is also a wake-up call to readers who are less than enthusiastic about these digital tools, helping them realize the cost of ignoring them. It is written for a wide range of medical professionals, including:
- Physicians, nurses, and entrepreneurs who want to understand how to use or develop digital products and services
- IT managers who need to fold these tools into existing computer networks at hospitals, clinics, and medical offices
- Healthcare executives who decide how to invest in these platforms and products
- Insurers who need to stay current on the latest trends and the evidence to support their cost effectiveness
Filled with insights from international experts, this book also features Dr. John Halamka's lessons learned from years of international consulting with government officials on digital health. It also taps into senior research analyst Paul Cerrato's expertise in AI, data analytics, and machine learning. Combining these lessons learned with an in-depth analysis of clinical informatics research, this book aims to separate hyped AI "solutions" from evidence-based digital tools. Together, these two pillars support the contention that these technologies can, in fact, help solve many of the seemingly intractable problems facing healthcare providers and patients.
Assisting organizations in improving their project management processes, the Project Management Maturity Model defines the industry standard for measuring project management maturity and agile and adaptive capabilities. Project Management Maturity Model, Fourth Edition provides a roadmap showing organizations how to move to higher levels of organizational behavior, improving project success and organizational performance. It is a comprehensive tool for enhancing project management practices, covering areas critical to organizational improvement, such as the project management office, management oversight, and professional development. It also provides methods for optimizing project management processes and suggestions for deploying the model as a strategic tool in improving business outcomes. New material in each chapter outlines good practices for implementing adaptive and agile processes. The book also includes the Project Portfolio Management Maturity Model, which covers best practices for determining portfolio maturity, setting short-term priorities, implementing benefits realization management, improving portfolio management processes, and tracking progress. The author, J. Kent Crawford, CEO of PM Solutions, describes the basics of project management maturity, including the benefits of assessing maturity, and presents a comprehensive framework for improving an organization's processes. Chapters are based on the ten project management knowledge areas specified in the Project Management Institute's standard, the PMBOK (R) Guide. This edition provides new and revised material based on the PMBOK (R) Guide, including a fresh focus on agile and adaptive methods, benefits realization, and organizational change management. Organizations can use this book to:
- Determine the maturity of your organization's project management processes
- Gauge readiness for agile transformation
- Map out a logical path to improve your organization's processes
- Set priorities for short-term process improvement
- Track and visualize improvements in project management over time
- Learn to translate process maturity into business results
After an objective assessment, an organization can set its goals for increasing the capability of its processes and develop a plan for reaching those goals. This book is ideal for anyone involved with improving the capability of an organization's project and portfolio management processes.
Today, it is not uncommon for practices and hospitals to be on their second or third EHR and/or contemplating a transition from the traditional on-premise model to a cloud-based system. As a follow-up to Complete Guide and Toolkit to Successful EHR Adoption ((c) 2011 HIMSS), this book builds on the best practices of the first edition, fast-forwarding to the latest innovations that are currently leveraged and adopted by providers and hospitals. We examine the role that artificial intelligence (AI) is now playing in and around EHR technology. We also address the advances in analytics and deep learning (also known as deep structured or hierarchical learning) and explain this topic in practical ways for even the most novice reader to comprehend and apply. The challenges of EHR-to-EHR migrations and data conversions are also covered, including the unethical practice of data blocking, used as a tactic by some vendors to hold data hostage. Further, we explore innovations related to interoperability, cloud computing, cyber security, and electronic patient/consumer engagement. Finally, this book deals with what to do with aging technology and databases, an issue rarely considered in any of the early publications on healthcare technology. What is the proper way to retire a legacy system, and what are the legal obligations of data archiving? Though a lot has changed since the 2011 edition, many of the fundamentals remain the same and will serve as a foundation for the next generation of EHR adopters and/or those moving on to their second, third, fourth, and beyond EHRs.
Trust and Records in an Open Digital Environment explores issues that arise when digital records are entrusted to the cloud and will help professionals to make informed choices in the context of a rapidly changing digital economy. Showing that records need to ensure public trust, especially in the era of alternative truths, this volume argues that reliable resources, which are openly accessible from governmental institutions, e-services, archival institutions, digital repositories, and cloud-based digital archives, are the key to an open digital environment. The book also demonstrates that current established practices need to be reviewed and amended to include the networked nature of cloud-based records, to investigate the role of new players, like cloud service providers (CSPs), and to assess the potential for implementing new, disruptive technologies like blockchain. Stancic and the contributors address these challenges by taking three themes - state, citizens, and documentary form - and discussing their interaction in the context of open government, open access, recordkeeping, and digital preservation. Exploring what is needed to enable the establishment of an open digital environment, Trust and Records in an Open Digital Environment should be essential reading for data, information, document, and records management professionals. It will also be a key text for archivists, librarians, professors, and students working in the information sciences and other related fields.